Primecuts – This Week In The Journals

July 31, 2012

By Benjamin P. Geisler, MD, MPH

Faculty peer-reviewed

July 28th was World Hepatitis Day. Just three days prior, a group of researchers from Johns Hopkins published a paper in JAMA on HCV/HIV co-infected patients.[1] This study demonstrated an independent correlation between hepatic fibrosis stage and a composite endpoint of end-stage liver disease, hepatocellular carcinoma, or death. As the authors note, these results support starting highly active antiretroviral treatment (HAART) for HIV in “most coinfected” patients.

Also last week, the 19th International AIDS Conference took place in Washington, D.C. Speakers included Hillary Rodham Clinton, Laura Bush, Bill Gates, and Elton John. Coincidentally, The Lancet started a series of papers on HIV-positive men who have sex with men,[2] and JAMA featured the 2012 recommendations of the International Antiviral Society-USA panel for the treatment of adult HIV patients, based on systematic PubMed and Embase searches.[3] The recommendations include offering HAART to all patients regardless of CD4 count while monitoring not just CD4 count and viral load, but also engagement in care, HAART adherence, and drug resistance. The recommended initial regimens are two nucleosides/nucleotides (tenofovir/emtricitabine or abacavir/lamivudine) plus a nonnucleoside reverse transcriptase inhibitor (efavirenz), a ritonavir-boosted protease inhibitor (atazanavir or darunavir), or an integrase strand transfer inhibitor (raltegravir).

In clinic, you might be asked about preexposure prophylaxis (PrEP) for HIV prevention, another hot topic these days. Krakower and Mayer reviewed PrEP in an online-first Annals article.[4] They conclude that PrEP does not provide complete protection, with risk reductions of 44% to 75% depending on the setting. This narrative review underlines a currently controversial option for prevention. Importantly, there are no “real world” data yet. If you need to know more, I recommend reading the entire article, for this is a topic that will continue to be heavily discussed.

Meet the newest targeted therapy on the oncology block: dabrafenib for BRAF V600E-mutated metastatic melanoma. Hauschild et al. published a randomized controlled trial [5] in The Lancet with a total of 250 previously untreated stage IV or unresectable stage III melanoma patients. The trial compared dabrafenib to a standard chemotherapy regimen (dacarbazine), was non-blinded, and randomized patients in a 3:1 fashion, a recurring trial design for new oncology agents thought to have great potential. After a median follow-up of 4.9 months, there were not enough deaths in either group to calculate median overall survival (mortality 11 vs. 14%; hazard ratio [HR] 0.6; 95% confidence interval [CI] 0.3 to 1.5). Median progression-free survival was 5.1 vs. 2.7 months (HR 0.3; 95% CI 0.2 to 0.5).
Dabrafenib selectively inhibits the gene product of the BRAF V600E mutation, a supposedly oncogenic variant of the B-RAF kinase. This mutation was present in 48% of the screened patients in this trial (own calculation). However, there is already a targeted therapy for the same indication, vemurafenib, which unlike dabrafenib is already FDA-approved. The pivotal trial [6] comparing vemurafenib to the same chemotherapy as in the present trial demonstrated overall survival of 84 vs. 64% (HR 0.4; 95% CI 0.3 to 0.6) and median progression-free survival of 5.3 vs. 1.6 months (HR 0.3; 95% CI 0.2 to 0.3). Taken together, the results of these two phase-III trials warrant a head-to-head trial or at least an indirect treatment comparison meta-analysis. However, neither agent may provide more than temporary relief, as tumor cells might, analogous to HIV, adapt to the therapy.

Staying in Europe and on the same topic, a systematic review with meta-analysis in BMJ estimated the burden of cutaneous melanoma attributable to tanning salons.[7] Boniol, Autier, and Boyle pooled the results of case-control, cohort, and cross-sectional studies and found a summary relative risk of 1.2 for melanoma among people who had ever used a sunbed. They fitted a mixed model to quantify the effect of each additional indoor tanning session per year, which came out as a nearly 2% increase in melanoma risk per session. Initiating indoor tanning before 35 years of age almost doubled the risk. The attributable burden of indoor tanning in Europe was estimated at 3,418 cases per year, or 5.4% of all melanoma cases. What this study adds to previous research is evidence of a dose-response relationship and of a role for age of initiation. In conclusion, sun-deprived young doctors, and everybody else, should refrain from using sunbeds.

Two studies in this week’s Archives highlighted the importance of taking a social history: one investigated the role of loneliness in functional decline and death in a nationally representative sample of adults aged 60 and older, the other the effect of living alone on cardiovascular events and death in outpatients aged 45 and older.[8, 9] In the respective studies, 43% of participants reported feeling lonely and 19% lived alone. Unsurprisingly, loneliness was a predictor of functional decline – declines in activities of daily living and mobility – and death. Living alone was associated with higher four-year mortality (14 vs 11%; p<0.01) and cardiovascular death (9 vs 7%; p<0.01). However, living alone was associated with increased mortality only among middle-aged, and not among elderly, patients.

Next, Primecuts heads to the emergency department: Mass General’s cardiac imaging program published the results of a multicenter trial of coronary CT angiography for low-to-intermediate-risk acute coronary syndrome (ACS) in this week’s New England Journal.[10] ED patients with signs and symptoms suggestive of ACS, but without a history of coronary disease, acute ischemic EKG changes, or an initially positive troponin, were randomly assigned to either coronary CT angiography or standard of care. The comparator strategy could include serial troponins, repeat resting or stress EKGs, treadmill testing, other advanced imaging such as SPECT, referral for conventional angiography, or simply “watchful waiting,” which occurred in 2% of CT patients and 22% of standard-of-care patients. The study’s endpoints were length of stay, disposition, cardiovascular complications at day 28, and costs. CT angiography was associated with a shorter stay (mean 23 vs 31 hours; p<0.001), shorter time to diagnosis (10 vs 19 hours; p<0.001), and higher rates of discharge directly from the main ED (most patients in the comparison group left from the observation unit, which was not classified as belonging to either the ED or the inpatient wards). Interestingly, there was no statistically significant difference in costs ($4,289 vs. $4,060; p=0.65). ACS was ultimately confirmed in only 2% of cases. An editorialist sees no role for CT angiography in this patient group, and in fact no role for any ED testing or observation at all.[11] Short-fused New Yorkers demanding a CT for faster discharge should be informed about the additional radiation exposure of CT angiography.

One last thing: In a special paper in this week’s New England Journal, a group at Harvard examined whether mortality rates changed after Medicaid expansions in Arizona, New York, and Maine in 2001 and 2002.[12] They compared mortality rates of these states, drawn from Centers for Disease Control and Prevention data, to four neighboring states without changes in Medicaid eligibility (Nevada and New Mexico combined, Pennsylvania, and New Hampshire). Over the five years post-enactment (compared to the five years prior), Medicaid enrollment increased by almost one fourth, or 2% of the overall sample of childless adults aged 19-64 years. This was associated with a 6% decrease in mortality rates. The decrease was strongest in non-whites, in people 35 years and older, and in counties with higher levels of poverty. While mortality rates decreased in New York, there was no statistically significant difference in Arizona or Maine. Comment: this study was obviously motivated by the Affordable Care Act’s intended Medicaid expansion, which, as the Supreme Court ruled, cannot be forced upon the states by threatening to cut their existing Medicaid funding. What the study teaches us is that Medicaid expansion might, on the whole, reduce mortality rates, but that your mileage is likely to vary from state to state.
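The study’s core comparison – the change in mortality in the expansion states minus the change in the control states over the same window – is a difference-in-differences estimate. A minimal sketch of that arithmetic, using made-up mortality figures rather than the study’s data:

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences: the change in the treated group,
    net of the secular trend observed in the control group."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical deaths per 100,000 adults, before and after expansion
# (illustrative numbers only, not taken from the paper).
effect = diff_in_diff(treated_pre=320.0, treated_post=300.0,
                      control_pre=310.0, control_post=312.0)
print(effect)  # -22.0: mortality fell 22/100,000 beyond the control trend
```

Subtracting the control states’ trend is what lets the authors attribute the residual mortality change to the expansion rather than to nationwide secular trends.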

Benjamin P. Geisler, MD, MPH is an intern in NYU’s Categorical Residency Program in Internal Medicine

Peer reviewed by Danise Schiliro-Chuang, MD, Assistant Professor of Medicine and Contributing Editor, Clinical Correlations

Image courtesy of Wikimedia Commons


  1. Limketkai, B.N., et al., Relationship of liver disease stage and antiviral therapy with liver-related events and death in adults coinfected with HIV/HCV. JAMA, 2012. 308(4): p. 370-8.
  3. Thompson, M.A., et al., Antiretroviral treatment of adult HIV infection: 2012 recommendations of the International Antiviral Society-USA panel. JAMA, 2012. 308(4): p. 387-402.
  4. Krakower, D. and K.H. Mayer, What Primary Care Providers Need to Know About Preexposure Prophylaxis for HIV Prevention: A Narrative Review. Ann Intern Med, 2012.
  5. Hauschild, A., et al., Dabrafenib in BRAF-mutated metastatic melanoma: a multicentre, open-label, phase 3 randomised controlled trial. Lancet, 2012.
  6. Chapman, P.B., et al., Improved survival with vemurafenib in melanoma with BRAF V600E mutation. N Engl J Med, 2011. 364(26): p. 2507-16.
  7. Boniol, M., et al., Cutaneous melanoma attributable to sunbed use: systematic review and meta-analysis. BMJ, 2012. 345: p. e4757.
  8. Perissinotto, C.M., I. Stijacic Cenzer, and K.E. Covinsky, Loneliness in older persons: a predictor of functional decline and death. Arch Intern Med, 2012. 172(14): p. 1078-84.
  9. Udell, J.A., et al., Living Alone and Cardiovascular Risk in Outpatients at Risk of or With Atherothrombosis. Arch Intern Med, 2012. 172(14): p. 1086-1095.
  10. Hoffmann, U., et al., Coronary CT angiography versus standard evaluation in acute chest pain. N Engl J Med, 2012. 367(4): p. 299-308.
  11. Redberg, R.F., Coronary CT Angiography for Acute Chest Pain. N Engl J Med, 2012. 367(4): p. 375-6.
  12. Sommers, B.D., K. Baicker, and A.M. Epstein, Mortality and access to care among adults after state Medicaid expansions. N Engl J Med, 2012.

Primecuts – This Week In The Journals

July 23, 2012

Jennifer Lee Dong, MD

Faculty Peer Reviewed

As we approach the Opening Ceremonies in London this weekend, there has been a great focus on wellness in recent health news. The New York Times reported that the FDA approved a new weight-loss drug, Qsymia, a combination pill of the appetite suppressant phentermine and topiramate [1]. Read on for more breaking news about weight loss and physical activity.

First, though, a theme in medical journals this week surrounds the previously controversial recommendations for prostate cancer screening. In this week’s issue of New England Journal of Medicine, Wilt et al. conducted a clinical trial in which 731 men with localized prostate cancer, mean age 67 years, were randomized to either radical prostatectomy or observation and followed for 10 years [2]. The patients had a median prostate-specific antigen (PSA) level of 7.8 ng/mL (mean 10.1). The primary outcome was all-cause mortality with a secondary outcome of prostate cancer-specific mortality. Patients in the observation group were offered palliative treatment or chemotherapy for symptoms; ultimately, 36 men in this group underwent radical prostatectomy.

Of the 364 patients in the prostatectomy group and 367 in the observation group, 171 (47%) and 183 (49.9%), respectively, died (hazard ratio 0.88; 95% CI 0.71-1.08; p=0.22), an absolute risk reduction of 2.9%. Patients who received radical prostatectomy survived for a median of 13 years compared to 12.4 years for those simply observed. For patients with PSA levels less than 10 ng/mL, there was no significant difference in outcomes. However, for patients with PSA levels greater than 10 ng/mL, prostatectomy reduced all-cause mortality by 13.2% (HR 0.67; 95% CI 0.48-0.94). The authors concluded that radical prostatectomy did not significantly reduce all-cause or prostate cancer-specific mortality compared to observation alone, although patients with higher-risk tumors who undergo prostatectomy may have slightly improved outcomes [2].

Along similar lines, the Annals of Internal Medicine published the new U.S. Preventive Services Task Force (USPSTF) recommendations for prostate cancer screening [3]. These replace the 2008 USPSTF guidelines with a grade D recommendation against routine PSA-based screening for prostate cancer. The authors explain that PSA testing may lead to overdiagnosis, detecting many cases of asymptomatic prostate cancer, and pseudo-disease, in which men with asymptomatic, PSA-detected cancer harbor tumors that either do not progress or progress so slowly that they do not significantly affect mortality. PSA testing often yields false positives, which can lead to undue worry about prostate cancer. In addition, positive PSAs often result in a prostate biopsy, about a third of which are associated with adverse effects such as pain, fever, bleeding, infection, or transient urinary difficulties. Ninety percent of men with PSA-detected prostate cancer in the U.S. receive early treatment with surgery, radiation, or androgen deprivation therapy, all of which can lead to long-term adverse effects including urinary incontinence, erectile dysfunction, bowel dysfunction, gynecomastia, and hot flashes. Thus, there is convincing evidence that PSA-based screening for prostate cancer causes overtreatment with a moderate magnitude of associated harm.

According to this review, the U.S. PLCO (Prostate, Lung, Colorectal, and Ovarian) Cancer Screening Trial showed no decrease in prostate cancer mortality from PSA screening, while the European Randomized Study of Screening for Prostate Cancer (ERSPC) showed an absolute reduction of roughly one prostate cancer death per 1000 men screened in the 55-69-year-old subgroup. Based on the available evidence, the USPSTF states with moderate certainty that the benefits of PSA screening do not outweigh the associated harms. Although the USPSTF had previously recommended against PSA-based screening only in men older than 75 years, with insufficient evidence to make a recommendation for younger men, it now recommends against PSA-based screening for prostate cancer in all age groups. For more details on the emerging commentary and where other medical organizations stand on this guideline, please refer to the clinical guideline document [3].

Mackenzie et al. reported in BMJ the results of a cohort study evaluating the risk of incident breast cancer in older women exposed to spironolactone [4]. Spironolactone possesses anti-androgenic and progestogenic properties, raising concern that the drug may promote breast cancer development. The authors conducted a retrospective, matched cohort study using a primary care database from the United Kingdom covering over 1 million women aged 55 years or older without a personal history of breast cancer, among whom 29,491 new cases of breast cancer (the primary outcome) were identified using Read codes. A patient was considered exposed to spironolactone if she received two prescriptions after the age of 55 years; two unexposed women were randomly selected as controls for each exposed patient, yielding 28,032 exposed women and 55,961 unexposed controls. The unadjusted rate of breast cancer was 0.39% per year in women exposed to spironolactone versus 0.38% per year in those not exposed. As expected, comorbid diabetes and heart failure were more prevalent in the exposed cohort. Factors associated with breast cancer included family history of breast cancer (HR 3.87, 95% CI 2.91-5.14), history of other cancers (HR 1.64, 1.44-1.87), and exposure to multiple drug classes (HR 1.04, 1.02-1.06), while exposure to steroids appeared protective (HR 0.78, 0.65-0.92); all of these effects were larger than the hazard ratio of 0.99 (CI 0.87-1.12) calculated for spironolactone exposure. Hence, there was no significant evidence to support the concern that spironolactone increases the incidence of breast cancer, and no relationship between spironolactone dose and breast cancer risk [4].

In wellness news, the Lancet published an early online article by Lee et al. describing the burden of physical inactivity in major non-communicable diseases such as coronary artery disease (CAD), type 2 diabetes, breast cancer, and colon cancer [5]. Prior studies have suggested that physical inactivity, like smoking and obesity, qualifies as a risk factor for these diseases and shortens life expectancy. Physical inactivity was defined as an activity level insufficient to meet present recommendations, equivalent to about 15-30 minutes of brisk walking each day. Lee et al. set out to quantify the effect of physical inactivity on the aforementioned diseases by calculating population attributable fractions (PAFs), an epidemiological measure that estimates the proportion of disease incidence in a population attributable to a risk factor. Using the latest World Health Organization data from 2008, with conservative calculations, the authors estimated that inactivity accounts for 6% of the burden of CAD, 7% of type 2 diabetes, 10% of breast cancer, 10% of colon cancer, and 9% of premature mortality overall. In population numbers, this means that physical inactivity accounted for 5.3 million of the 57 million deaths worldwide in 2008. The authors go on to estimate that decreasing physical inactivity by 10% could avert more than 533,000 deaths a year, and reducing it by 25% could avert up to 1.3 million deaths each year. According to this paper, the life expectancy of the world population would increase by 0.68 years if physical inactivity were eliminated. Consistent with the PAF estimates above, eliminating inactivity would have the largest relative effect on colon and breast cancer and the smallest on CAD, where 6% of the worldwide disease burden could be eliminated if all inactive persons became active [5].
One key takeaway message from this study is that recommending physical activity to our patients may delay the development of CAD, diabetes, breast cancer, and colon cancer, and reduce overall mortality.
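The PAF arithmetic itself is simple (Levin’s formula). A minimal sketch in Python; the 31% inactivity prevalence and relative risk of 1.33 used below are illustrative assumptions, not figures quoted from the paper:

```python
def paf(prevalence, relative_risk):
    """Population attributable fraction (Levin's formula): the share of
    cases that would be avoided if the risk factor were eliminated."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Assumed inputs, for illustration only.
fraction = paf(prevalence=0.31, relative_risk=1.33)
attributable_deaths = fraction * 57_000_000  # total worldwide deaths, 2008
print(f"PAF = {fraction:.1%}; attributable deaths = {attributable_deaths:,.0f}")
```

With these assumed inputs the PAF comes out near 9%, in the same ballpark as the paper’s overall premature-mortality estimate; the actual study used disease-specific relative risks and country-level prevalence data.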

In the Annals of Internal Medicine, Angell et al. report on the results of a 2006 New York City regulation restricting trans fats in restaurants [6]. They assessed the effect of this regulation on the intake of trans fats and saturated fats pre- and post-implementation. A research group funded by the City of New York and the Robert Wood Johnson Foundation Healthy Eating Research Program conducted brief lunchtime surveys at 168 NYC locations of 11 fast-food chains (McDonald’s, Burger King, Wendy’s, Subway, Au Bon Pain, Kentucky Fried Chicken, Popeye’s, Domino’s, Pizza Hut, Papa John’s, and Taco Bell). Based on survey responses, the authors tallied 6969 individual purchases in 2007 and 7885 in 2009. They found a mean reduction in trans fat per purchase of 2.4 g (95% CI –2.8 to –2.0; p<0.001), an increase in saturated fat per purchase of 0.6 g (CI 0.1 to 1.0; p=0.011), a mean reduction in trans plus saturated fat of 1.9 g (CI –2.5 to –1.9; p<0.001), and a mean decrease in trans fat per 1000 kcal of 2.7 g (CI –3.1 to –2.3; p<0.001). The authors note that purchases with “zero” trans fat increased from 32% to 59%, a result unrelated to the poverty rate of the neighborhood. Because products with up to 0.5 g of trans fat can be labeled as containing “zero” trans fat, it is difficult to determine how many of these products truly contained none. The greatest absolute decrease in trans fat per purchase occurred in the hamburger chains, averaging 3.8 g less trans fat, while sandwich chains (i.e., Subway, Au Bon Pain) showed an increase in trans fat per purchase of 0.1 g (p<0.001). Despite the slight increase in saturated fat content, total trans fat and trans plus saturated fat decreased overall, suggesting a net health benefit from the NYC regulation. It remains to be seen whether local food policies such as this one can have a broad positive effect if implemented nationally across the country’s food companies [7].

Other articles of note:

Mueller SK, Sponsler KC, Kripalani S, Schnipper JL. Hospital-Based Medication Reconciliation Practices: A Systematic review. Arch Intern Med. 2012;172(14):1-13.

A systematic review of 26 controlled studies looked at different inpatient medication reconciliation practices and how they might affect clinical outcomes. Evidence currently supports medication reconciliation systems that rely heavily on pharmacists and focus on patients who are at high risk for developing adverse events.

Esbjörnsson J, et al. Inhibition of HIV-1 Disease Progression by Contemporaneous HIV-2 Infection. N Engl J Med. 2012 July 19;367(3):224-232.

Patients infected with HIV-2 before acquiring HIV-1 had a delayed onset of AIDS and the highest CD4+ T-cell counts. The authors believe that concomitant HIV-2 infection may inhibit HIV-1 disease progression.

Ahmed S, Li Q, Liu L, Tsui AO. Maternal deaths averted by contraceptive use: an analysis of 172 countries. The Lancet. 2012 July 14; 380(9837):111-125.

A database review estimated that although over 300,000 women died of maternal causes in 2008, contraceptive use averted approximately 270,000 maternal deaths. Because the number of unwanted pregnancies remains high in many developing countries, the evidence supports contraceptive use as an effective approach to preventing and reducing maternal mortality.

Fried MW, Navarro VJ, Afdhal N, et al. Effect of Silymarin (Milk Thistle) on Liver Disease in Patients with Chronic Hepatitis C Unsuccessfully Treated with Interferon Therapy: A Randomized Controlled Trial. JAMA. 2012; 308(3):274-282.

A multicenter, double-blind, placebo-controlled trial in which participants with chronic hepatitis C who had previously been treated unsuccessfully with interferon-based therapy were randomly assigned to low-dose silymarin, high-dose silymarin, or placebo and followed over 24 weeks of treatment. Even higher doses of silymarin did not reduce ALT levels significantly more than placebo.

Shirani A, Zhao Y, Karim ME, et al. Association Between Use of Interferon Beta and Progression of Disability in Patients With Relapsing-Remitting Multiple Sclerosis. JAMA. 2012;308(3):247-256.

A retrospective cohort study in Canada found that patients with relapsing-remitting MS who received interferon beta did not have a statistically significant reduction in progression of disability compared with a contemporary control cohort.

Dr. Jennifer Lee Dong is a 3rd year resident at NYU Langone Medical Center

Peer reviewed by Lakshmi Tummala, Associate Editor, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Pollack A. F.D.A. Approves Qsymia, a Weight-Loss Drug. NY Times 17 July 2012.

2. Wilt TJ, Brawer MK, Jones KM, Barry MJ, Aronson WJ, et al. Radical Prostatectomy versus Observation for Localized Prostate Cancer. N Engl J Med. 2012 July 19;367(3):203-213.

3. Moyer VA. Screening for Prostate Cancer: U.S. Preventive Services Task Force Recommendation Statement. Ann Intern Med. 2012 July 17; 157(2): 120-134.

4. Mackenzie IS, MacDonald TM, Morant S, Wei L. Spironolactone and risk of incident breast cancer in women older than 55 years: retrospective, matched cohort study. BMJ 2012;345:e4447.

5. Lee IM, Shiroma EJ, Lobelo P, Puska P, Blair SN, Katzmarzyk PT. Effect of physical inactivity on major non-communicable diseases worldwide: an analysis of burden of disease and life expectancy. The Lancet, Early Online Publication, 18 July 2012. doi:10.1016/S0140-6736(12)61031-9.

6. Angell SY, Cobb LK, Curtis CJ, Konty KJ, Silver LD. Change in Trans Fatty Acid Content of Fast-Food Purchases Associated with New York City’s Restaurant Regulation: A Pre-Post Study. Ann Intern Med. 2012 July 17; 157(2):81

Primecuts – This Week In The Journals

July 16, 2012

By Mark Adelman, MD

Faculty Peer Reviewed

The U.S. House of Representatives voted for the 33rd time last Wednesday to repeal the Affordable Care Act, a largely symbolic move as the prior 32 bills were never passed by the Senate.[1] With the realities of roughly 50 million uninsured Americans[2] and healthcare spending accounting for nearly 18% of US GDP[3] in mind, let us review some recent publications that may improve both clinical care and runaway healthcare spending.

In an online-first publication by the Annals of Internal Medicine,[4] Smith reports on a curriculum designed by a collaboration between the Alliance for Academic Internal Medicine and the American College of Physicians “to incorporate the principles of high-value, cost-conscious care into residency training.” Noting that medical education has historically lacked specific training in the stewardship of healthcare resources, the curriculum development committee created a series of lectures and small-group activities designed to be incorporated into pre-existing resident education sessions (e.g., noon conferences and morning reports). The goal of such a curriculum is to get residents more accustomed to considering the “benefits, harms, and costs of a test or intervention as well as use of evidence-based, shared decision making.”

While some may argue that the role of the physician is to advocate for each individual patient, and so a consideration of cost should not factor into one’s clinical decision-making, I believe that cost-consciousness merely acknowledges the unavoidable fact that healthcare spending in the US is on an unsustainable trajectory. This AAIM-ACP collaboration should be commended for their recognition of a traditionally weak spot in medical education and a proposed solution that will hopefully contribute to improved care for patients nationwide in the future.

The most recent print edition of Annals includes a clinical practice guideline on packed red blood cell (PRBC) transfusion developed by the AABB (formerly known as the American Association of Blood Banks).[5] The guideline makes four major recommendations related to the use of PRBC transfusion in hospitalized, hemodynamically stable patients with anemia. Recommendation #1 is that a restrictive transfusion strategy, i.e. hemoglobin level <7 g/dL, be used for adult and pediatric ICU patients. Of note, this is the only recommendation deemed to have high-quality evidence and thus the strength of the recommendation is “strong.” Recommendation #2 also advocates for adhering to a restrictive transfusion strategy for patients with pre-existing cardiovascular disease, with a transfusion threshold of 8 g/dL in the presence of symptoms (defined as chest pain, orthostatic hypotension or tachycardia unresponsive to fluid resuscitation, or congestive heart failure). The quality of the evidence for this recommendation was moderate, making the strength of the recommendation “weak.” The AABB was unable to recommend a liberal or restrictive transfusion strategy for patients with acute coronary syndrome (recommendation #3), noting that the systematic review that preceded the development of the guideline did not identify any randomized, controlled trials (RCTs) that address this question. Recommendation #4 is that transfusion decisions be influenced by patient symptoms and not just hemoglobin concentrations. However, this is another weak recommendation based on low quality evidence, with only one identified RCT that incorporated symptoms into the decision to transfuse. Therefore, the recommendation appears to be based more on expert opinion and “conventional wisdom, based on physiologic reasoning….” The guideline authors state that if restrictive transfusion strategies were widely implemented, exposure of patients to PRBC transfusions would decrease by an average of approximately 40%. 
This seems like a worthy goal given the risk of complications associated with blood transfusions and the substantial resources consumed by the estimated 15 million yearly PRBC transfusions in the US.

Most women who suffer from urinary tract infections (UTIs) have likely been advised to try cranberry juice as a treatment or prophylactic measure at some point, whether by a physician, friend, or family member. Wang and colleagues conducted a systematic review and meta-analysis, published in the Archives of Internal Medicine,[6] examining the evidence for cranberry-containing products in the prevention of UTIs. The authors note that UTIs are a common diagnosis, accounting for an estimated 7 million office visits, 1 million ER visits, and 100,000 hospitalizations yearly in the US, at a total cost of $1.6 billion. Analyzing 10 studies with a total of nearly 1500 subjects, the authors calculated an overall relative risk of UTI in cranberry users versus non-users of 0.62 (95% CI, 0.49-0.80). However, the studies were fairly heterogeneous (I² = 43%), with much variability in the populations studied, the method and dose of cranberry administration, and even the definition of UTI, so the results should be interpreted with caution and further studies are warranted. Still, given that cranberry consumption is a low-risk and inexpensive intervention, it seems reasonable to recommend it for the prevention of UTIs, with the caveat that its effectiveness is by no means proven.
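A pooled relative risk like 0.62 can be translated into a number needed to treat (NNT) once a baseline risk is assumed. A quick sketch of that conversion; the 20% annual baseline UTI risk below is a made-up illustration, not a figure from the meta-analysis:

```python
def nnt(baseline_risk, relative_risk):
    """Number needed to treat = 1 / absolute risk reduction."""
    absolute_risk_reduction = baseline_risk * (1.0 - relative_risk)
    return 1.0 / absolute_risk_reduction

# Assume, purely for illustration, a 20% annual baseline risk of UTI.
print(round(nnt(baseline_risk=0.20, relative_risk=0.62), 1))  # 13.2
```

In other words, under that assumed baseline, roughly 13 women would need to use cranberry products for a year to prevent one UTI; in lower-risk populations the NNT would be correspondingly larger.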

Much has been published on the possible interaction between clopidogrel (Plavix) and proton pump inhibitors (PPIs), with conflicting evidence as to whether there is a clinically relevant increase in cardiovascular events attributable to a pharmacokinetic interaction between these two agents, both of which are metabolized by CYP2C19. A group of UK researchers led by Douglas conducted an observational study, published online by BMJ, that attempted to address this issue.[7] Their publication consisted of two studies with different designs: the first was a traditional cohort study, while the second was a self-controlled case series, a design in which each case serves as its own control. In this analysis, incidence rate ratios for the primary outcome of composite all-cause mortality or incident MI were calculated by comparing event rates during periods when patients were taking both Plavix and a PPI (case periods) with periods when they were taking Plavix but not a PPI (control periods).

In the cohort study, the adjusted hazard ratio was 1.37 (95% CI 1.27-1.48), suggesting an association between Plavix-PPI co-administration and mortality and/or MI. However, in the self-controlled case series there was no significant difference in the adjusted incidence rate ratio (IRR 0.75, 95% CI 0.55-1.01). The authors argue that an association present in the cohort study but absent in the self-controlled case series suggests that there is no causal relationship between Plavix-PPI co-administration and death/MI: patients who take both Plavix and a PPI may simply differ from those who take Plavix alone, and that difference, rather than the drug combination, may drive the increased risk of death/MI.
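The incidence rate ratio in the self-controlled design is simply the ratio of event rates per unit of person-time in the case periods versus the control periods. A minimal sketch with hypothetical counts (not data from the study):

```python
def incidence_rate_ratio(case_events, case_person_time,
                         control_events, control_person_time):
    """Event rate during exposed (case) periods divided by the event
    rate during unexposed (control) periods of the same patients."""
    case_rate = case_events / case_person_time
    control_rate = control_events / control_person_time
    return case_rate / control_rate

# Hypothetical event counts and person-years, for illustration only.
print(round(incidence_rate_ratio(30, 400.0, 50, 500.0), 2))  # 0.75
```

Because each patient contributes both case and control periods, stable between-patient confounders (comorbidity, frailty, indication) cancel out of the ratio, which is the design’s main appeal over an ordinary cohort comparison.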

Another study related to the treatment of coronary artery disease, this time its surgical management, was published in JAMA.[8] The RED-CABG study was an RCT designed to assess the efficacy and safety of acadesine, administered perioperatively to moderate-to-high-risk CABG patients, in reducing a composite outcome of all-cause mortality, nonfatal stroke, and severe left ventricular dysfunction (SLVD) through 28 days. Acadesine is an adenosine-regulating agent that increases the availability of adenosine to myocardial cells; because endogenous adenosine decreases the severity of ischemia-reperfusion injury, this forms the physiologic rationale for using an agent like acadesine in the peri-CABG setting. RED-CABG was projected to enroll 7500 participants; however, the study was halted after enrollment of about 3000 patients when an interim analysis showed no significant difference in the incidence of the primary outcome between the intervention and placebo groups (OR 1.01 [95% CI 0.73-1.41]; p=0.94), meeting a pre-specified futility threshold. This lack of benefit contrasts with the findings of a 1997 meta-analysis of previous acadesine trials. The authors note that the incidence of the primary outcome (~5% in both groups) was lower than expected and may reflect more recent advances in the management of CABG patients, including methods of myocardial protection other than pharmacologic additives.

This week’s selection of articles covers a broad range of topics, addressing clinical issues common in the US and thus practical areas for improving outcomes, minimizing harm, and providing cost-conscious care.

Other articles of note:

Ahmed S, Li Q, Liu L, Tsui AO et al. Maternal deaths averted by contraceptive use: an analysis of 172 countries. Lancet. 2012;380(9837):111-125.

Statistical modeling estimated that worldwide maternal mortality in 2008 was 342,203 deaths. An estimated 272,040 maternal deaths were prevented by contraceptive use, and satisfying the unmet need for contraception could prevent an additional 104,000 maternal deaths every year.

Aubin HJ, Farley A, Lycett D, Lahmek P, Aveyard P. Weight gain in smokers after quitting cigarettes: meta-analysis. BMJ. 2012 Jul 10;345:e4439.

Meta-analysis of 62 studies that recorded weight change from baseline in abstinent smokers. Mean weight gain was 4-5 kg at 12 months after smoking cessation, with most of the gain occurring within the first three months of quitting; 16% of subjects lost weight, and 13% gained more than 10 kg.

Flaherty KT, Robert C, Hersey P et al. Improved survival with MEK inhibition in BRAF-mutated melanoma. N Engl J Med. 2012 Jul 12;367(2):107-14.

Open-label phase 3 RCT comparing trametinib, an oral selective MEK inhibitor, to cytotoxic chemotherapy (dacarbazine or paclitaxel) in patients with V600E or V600K BRAF-mutated metastatic melanoma. Median progression-free survival was 4.8 months in the trametinib group versus 1.5 months in the chemotherapy group; 6-month overall survival was 81% in the trametinib group versus 67% in the chemotherapy group.

Greenberger NJ, Sharma P. Update in Gastroenterology and Hepatology: Evidence Published in 2011. Ann Intern Med. 3 July 2012;157(1):44-48.

Summary of studies published in 2011 relevant to the practice of gastroenterology and hepatology. Topics include esophageal disorders, PPIs, hyperemesis, mast cell disorders, treatment of C. difficile infections, pancreatitis, and hepatitis.

Dr. Mark Adelman is a 2nd year resident at NYU Langone Medical Center

Peer reviewed by Robert Gianotti, MD, Associate Editor, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Caldwell LA. House passes health care repeal, again. CBS News. July 11, 2012.

2. US Census Bureau. Income, Poverty and Health Insurance Coverage in the United States: 2009. September 16, 2010.

3. Centers for Medicare and Medicaid Services. National Health Expenditure Data.

4. Smith CD. Teaching High-Value, Cost-Conscious Care to Residents: The Alliance for Academic Internal Medicine-American College of Physicians Curriculum. Ann Intern Med. 2012 Jul 10:E-496.

5. Carson JL, Grossman BJ, Kleinman S et al. Red Blood Cell Transfusion: A Clinical Practice Guideline From the AABB. Ann Intern Med. 2012;157(1):49-58.

6. Wang CH, Fang CC, Chen NC et al. Cranberry-Containing Products for Prevention of Urinary Tract Infections in Susceptible Populations: A Systematic Review and Meta-analysis of Randomized Controlled Trials. Arch Intern Med. 2012;172(13):988-96.

7. Douglas IJ, Evans SJW, Hingorani AD et al. Clopidogrel and interaction with proton pump inhibitors: comparison between cohort and within person study designs. BMJ. 2012;345:e4388.

8. Newman MF, Ferguson TB, White JA et al. Effect of Adenosine-Regulating Agent Acadesine on Morbidity and Mortality Associated With Coronary Artery Bypass Grafting: The RED-CABG Randomized Controlled Trial. JAMA. 2012;308(2):157-164.

Primecuts – This Week In The Journals

July 9, 2012

Jenny Gartshteyn, MD

Faculty Peer Reviewed

It’s the week of July 4th 2012, people gather on the boardwalks and rooftops, fireworks light up the sky – silent tribute to how far art and science have advanced since the first discovery of fireworks in China in the 10th century. Paralleling the advancement in pyrotechnics from year to year is the growing body of medical research and knowledge. In this spirit of progress, let’s review some of the more exciting medical findings from this week.

In this week’s issue of The Lancet, a new once-daily combination pill for the initial treatment of HIV was evaluated in a phase 3 non-inferiority trial. [1] Today, the preferred initial therapy for HIV consists of two nucleoside/nucleotide reverse transcriptase inhibitors (generally emtricitabine and tenofovir) combined with a third agent (a non-nucleoside reverse transcriptase inhibitor, a protease inhibitor, or an integrase inhibitor). This randomized, double-blind phase 3 trial compared the gold standard, emtricitabine (FTC)/tenofovir (TDF)/efavirenz (EFV) – sold under the brand name Atripla – with an experimental regimen of emtricitabine (FTC)/tenofovir (TDF)/elvitegravir (EVG, a new integrase inhibitor) plus the investigational CYP3A4 inhibitor cobicistat (COBI). 707 treatment-naive North American patients with non-resistant HIV-1 (>5000 RNA copies/mL) and preserved renal function (eGFR >70 mL/min) were randomized to one of the two regimens, with a primary endpoint of viral suppression (HIV RNA <50 copies/mL) at week 48. In the intention-to-treat analysis, with a pre-specified non-inferiority margin of 12%, the primary endpoint was achieved in 84.1% of patients in the Atripla group and 87.6% in the experimental group, a difference of 3.6% (95% CI -1.6% to 8.8%). Virologic response was seen earlier in the experimental group, a difference that persisted up to week 16 but not beyond. In each group, 2% of patients developed resistance to the antiretroviral regimen. Discontinuation rates were similar (12% in the experimental group and 15% in the control group); nausea was more common in the experimental group (23% vs 19%), whereas dizziness, headache, and rash were more common in the control group (likely secondary to efavirenz).
This study suggests that a new one-tablet regimen may soon be available for the initial treatment of HIV infection in patients with non-resistant virus and preserved renal function.
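The non-inferiority reasoning in the trial above can be sketched numerically: the regimen is declared non-inferior if the entire confidence interval for the difference in response rates lies above the pre-specified -12% margin. A minimal sketch using a simple Wald interval; the exact per-arm sample sizes are not given here, so a roughly even split of the 707 patients is assumed:

```python
from math import sqrt

def noninferiority_check(p_exp, n_exp, p_ctl, n_ctl, margin=-0.12, z=1.96):
    """Wald 95% CI for the difference in response proportions (experimental - control).
    Non-inferiority is declared when the entire CI lies above the margin."""
    diff = p_exp - p_ctl
    se = sqrt(p_exp * (1 - p_exp) / n_exp + p_ctl * (1 - p_ctl) / n_ctl)
    lo, hi = diff - z * se, diff + z * se
    return diff, (lo, hi), lo > margin

# Arm sizes are assumed (707 randomized overall, split roughly evenly)
diff, (lo, hi), noninferior = noninferiority_check(0.876, 353, 0.841, 354)
```

With these assumed arm sizes the computed interval is close to the reported -1.6% to 8.8%, and since its lower bound sits well above -12%, the experimental regimen meets the non-inferiority criterion.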

On the topic of new, experimental medications, the New England Journal of Medicine this week published a phase 2 study by Olnes et al. evaluating the effect of a thrombopoietin mimetic (eltrombopag, or Promacta) in patients with aplastic anemia. [2] Twenty-five consecutive adults with aplastic anemia refractory to immunosuppression with anti-thymocyte globulin (ATG)/cyclosporine were recruited into a non-blinded study to receive a 12-week course of daily eltrombopag. The primary endpoint was a response in one or more lineages at 12 weeks, pre-defined as an increase of +20,000 platelets, +1.5 g/dL hemoglobin, or +500 absolute neutrophil count (ANC), or a reduction in the number of transfusions required for hemoglobin of 9 g/dL or platelets <10,000. Eleven of the 25 patients met the primary response criteria (9 had a platelet response, 6 an erythroid response, and 9 eventually a neutrophil response). Seven of the 11 responders continued to receive eltrombopag (150 mg daily), and 6 patients eventually met response criteria in all three lineages. This study is important because for most patients with refractory aplastic anemia, stem-cell transplantation remains the only hope – and if a suitable donor is not found, more than 40% will die of bleeding or infection within 5 years of diagnosis.[2] Thrombopoietin regulates platelet production via the c-MPL receptor on megakaryocytes; however, hematopoietic stem and progenitor cells also express c-MPL, and thrombopoietin-mediated activation of these receptors may open new therapeutic opportunities in the treatment of aplastic anemia.

Switching gears, a Nature Medicine article by Breen et al. proposes a new physiological explanation for the rapid glucose-lowering effect of bariatric surgery in type 2 diabetes. [3] The group suggests that nutrient delivery to the small bowel activates a neurocircuit, relayed through the brain, that ultimately signals the liver to reduce its glucose production, thus improving hyperglycemia. Direct intrajejunal administration of glucose and lipid in rats increased overall glucose metabolism and decreased endogenous glucose production; the same effect was not seen with an equal amount of glucose infused directly into the hepatic portal vein. A diabetic rat model (induced with the beta-cell toxin streptozotocin) was then used in a 2×2 design, with control and diabetic rats randomized to duodenal-jejunal bypass surgery (DJB) or sham surgery. As early as day 2, DJB-operated diabetic rats had reduced plasma glucose levels compared with sham-operated diabetic rats, a finding not associated with any change in weight or food intake. By 14 days after surgery, plasma glucose levels of DJB-operated rats remained normal (no significant difference from the non-diabetic control groups), whereas glucose concentrations in the sham-operated diabetic rats remained at presurgical values (~400 mg/dL). Although still controversial and of unknown durability, this early work may pave the way for bariatric surgery beyond weight loss; specifically, it may find its own niche in the treatment of both type 1 and type 2 diabetes.

Another common problem of the aging population is osteoporosis, and the optimal treatment to prevent fractures remains unsettled. A NEJM meta-analysis by Bischoff-Ferrari et al. evaluated the effect of vitamin D supplementation using pooled data from 11 double-blind, randomized, controlled trials involving 31,022 persons 65 years or older who received oral vitamin D, alone or in combination with calcium, compared with control (placebo or calcium alone). [4] Primary endpoints were the risk of hip fracture and of any nonvertebral fracture. Although the intention-to-treat analysis showed nonsignificant reductions of 10% in hip fractures and 7% in nonvertebral fractures, an alternate primary analysis (based on actual vitamin D intake in treated participants versus controls, adjusted for adherence) found a 30% reduction in hip fractures (hazard ratio 0.70; 95% CI 0.58-0.87) and a 14% reduction in any nonvertebral fracture (hazard ratio 0.86; 95% CI 0.76-0.96) at intake levels of 792-2000 IU per day. Notably, there was no reduction in fractures at intake levels below 792 IU per day. One limitation is that any effect of vitamin D independent of calcium could not be assessed, because all trials using higher doses (>800 IU/day) also gave calcium. In summary, to be effective, vitamin D appears to require doses of at least 800 IU per day, with the exact therapeutic dose and the effect of calcium supplementation still unknown.

Bisphosphonates, another popular treatment for osteoporosis, have also had their safety questioned recently. A case-control study by Meier et al. in the Archives of Internal Medicine evaluated whether the recent rise in new, atypical femoral fractures (unlike the classic osteoporosis-related femoral fractures) is associated with bisphosphonate use. [5] In a retrospective analysis, 477 patients aged 50 years or older who presented with a femoral or subtrochanteric fracture not related to trauma, tumor, or other conditions affecting bone integrity were radiographically classified (by blinded review) as having either atypical fracture lines (cases) or classic fracture lines (controls). Of the 477 patients, 39 had atypical fractures and 438 had classic fractures. Among patients with atypical fractures, 82.1% were taking a bisphosphonate, compared with 6.4% of controls (odds ratio, adjusted for vitamin D, corticosteroids, PPI use, sex, and age: 69.1; 95% CI 22.8-209.5). The association between atypical fractures and bisphosphonate use strengthened with longer duration of use: OR 46.9 at 2-5 years (95% CI 14.2-154.4), OR 117 at 5-9 years (95% CI 34.2-401.7), and OR 175.7 at >9 years (95% CI 30.0-1027.6). This study implies a strong association between prolonged bisphosphonate use and atypical fracture risk. However, given the small incidence of atypical fractures relative to classic fractures, the benefits of bisphosphonate use are still likely to outweigh the risks, although shortening the duration of treatment may be warranted given the long half-life of bisphosphonates and the risk associated with extended use.
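The adjusted odds ratio above can be sanity-checked against the crude odds ratio from the underlying 2×2 table. A minimal sketch, with exposure counts back-calculated from the reported percentages (32/39 ≈ 82.1% of cases and 28/438 ≈ 6.4% of controls, so approximate):

```python
def odds_ratio(exposed_cases, cases, exposed_controls, controls):
    """Crude odds ratio from a 2x2 case-control table (no adjustment for confounders)."""
    a, b = exposed_cases, cases - exposed_cases           # cases: exposed / unexposed
    c, d = exposed_controls, controls - exposed_controls  # controls: exposed / unexposed
    return (a / b) / (c / d)

# Counts reconstructed from the reported percentages, so approximate
or_crude = odds_ratio(32, 39, 28, 438)
```

The crude estimate lands near 67, in the same ballpark as the adjusted OR of 69.1, suggesting that the confounders adjusted for did little to move the estimate.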

So as the smoke of the fireworks clears, we are left with several more medical advancements and a lot more unsolved puzzles on the horizon. Stay tuned for next week’s Primecuts!

Dr. Jenny Gartshteyn is a 2nd year resident at NYU Langone Medical Center

Peer reviewed by Neil Shapiro, MD, Editor-In-Chief, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Paul E Sax, Edwin DeJesus, Anthony Mills, et al. Co-formulated elvitegravir, cobicistat, emtricitabine, and tenofovir versus co-formulated efavirenz, emtricitabine, and tenofovir for initial treatment of HIV-1 infection: a randomised, double-blind, phase 3 trial, analysis of results after 48 weeks. The Lancet 2012; 379:2439-2448

2. Matthew J Olnes, Phillip Scheinberg, et al. Eltrombopag and improved hematopoiesis in refractory aplastic anemia. New England Journal of Medicine 2012; 367(1):11-19

3. Danna Breen, Brittany Rasmussen, et al. Jejunal nutrient sensing is required for duodenal-jejunal bypass surgery to rapidly lower glucose concentrations in uncontrolled diabetes. Nature Medicine 2012; 18(6):950-956

4. Heike A. Bischoff-Ferrari, Walter C Willett, et al. A pooled analysis of vitamin D dose requirements for fracture prevention. New England Journal of Medicine 2012; 367(1):40-49

5. Raphael P.H. Meier, Thomas V Perneger, et al. Increased occurrence of atypical femoral fractures associated with bisphosphonate use. Archives of Internal Medicine 2012; 172(12):930-936

Other Noteworthy Publications:

1. Lesley A Inker et al. Estimating Glomerular Filtration Rate from Serum Creatinine and Cystatin C. NEJM 2012; 367(1):20-29. As a substitute for serum creatinine as a measurement of eGFR, GFR calculations based on cystatin C levels may be more accurate.

2. Anjan Debnath et al. A high-throughput drug screen for Entamoeba histolytica identifies a new lead and target. Nature Medicine 2012; 18:956-960. Debnath and colleagues report their identification of auranofin, an approved treatment for rheumatoid arthritis, as a candidate new drug for combating E. histolytica infection.

3. Vanessa D Alphonse et al. Mechanisms of Eye Injuries From Fireworks. JAMA 2012; 308(1):33-34. Cadaver eyes were procured within 55 days of death and exposed to simulated fireworks with the only observed injuries being corneal abrasions caused by projected material.

Primecuts-This Week in the Journals

July 2, 2012

Alexander Volodarskiy, MD

With every new summer come hotter days, shorter nights, and the inevitable medical metamorphosis on July 1st: senior residents finally graduating and nervous, bright-eyed, bushy-tailed new interns, hearts in hand, arriving to save lives. With them in mind, we jump into this week’s section of Prime Cuts.

In this week’s New England Journal of Medicine, Kang et al. looked at whether patients with left-sided infective endocarditis could benefit from early surgery.[1,2] The study excluded patients with small vegetations (≤10 mm), right-sided vegetations, prosthetic valves, heart block, and moderate-to-severe congestive heart failure. Seventy-six patients in South Korea were randomized to either an early surgery group (37 patients scheduled for surgery within 48 hours) or a conventional treatment group (39 patients). In the end, all of the patients in the early surgery group and 30 of 39 in the conventional group underwent surgery, but the results were markedly different: only 3% of the early surgery group reached the primary endpoint of death or embolic event, compared with 25% of the conventional group. This difference was largely driven by the large number of embolic events in the conventional group; the mortality difference between the two groups was not statistically significant. These findings underscore the role of early surgery in reducing the morbidity of left-sided native-valve endocarditis.

While endocarditis is a risk factor for cerebral embolic phenomena, it is far from the only one. A JAMA study by Xian et al. looked at the risk of intracranial hemorrhage in patients receiving warfarin who are also given intravenous tissue plasminogen activator (TPA) for acute ischemic stroke.[3] Using data from the American Heart Association “Get With The Guidelines” registry, they identified 1,802 patients with pre-admission warfarin use and an INR less than 1.7 who were treated with TPA and compared them with 21,635 controls who received TPA but were not on warfarin. Patients in the warfarin group were older (74.1 vs. 69.5 years), had more comorbidities (atrial fibrillation, prosthetic valves, prior strokes or transient ischemic attacks, and peripheral vascular disease), and had more severe strokes. For the primary outcome of symptomatic intracranial hemorrhage, the warfarin group had seemingly worse outcomes, with a crude odds ratio of 1.22, although the 95% CI (0.99-1.51) included the null value of 1.0. However, after adjustment for these comorbidities, the two groups fared about equally, with an adjusted odds ratio of only 1.01 (0.82-1.25). These results imply that TPA should be considered in all patients who meet criteria, regardless of warfarin exposure, as long as the INR is less than 1.7.

A related article in The Lancet by Zinkstok and Roos looked at whether early administration of aspirin in patients treated with TPA may decrease the risk of re-occlusion after TPA-induced recanalization. [4,5] The study recruited 642 patients in the Netherlands who were randomized in an open label manner (but with blinded end point assessment) to an early aspirin group receiving 300 mg of IV aspirin within 90 minutes of TPA (322 patients) or a standard group receiving standard of care anti-platelet therapy 24 hours later. The primary end point was a favorable outcome at 3 months defined as a score of 0-2 on a modified Rankin scale, which is commonly used for measuring the degree of disability. A score of 0 means the patient has no symptoms, while a score of 2 indicates slight disability but the patient is still able to perform all ADLs and IADLs independently.

Unfortunately, the trial was terminated prematurely because of an excess of symptomatic intracranial hemorrhage and no evidence of benefit in the aspirin group. At 3 months, 78 patients had been lost to follow-up; 174 (54%) patients in the early aspirin group versus 183 (57%) in the standard treatment group had a favorable outcome, a difference that was not statistically significant. On the other hand, the early aspirin group had 14 (4.3%) episodes of symptomatic intracranial hemorrhage compared with 5 (1.6%) in the standard treatment group, a significant difference and the biggest driver of poor outcomes in the early aspirin group. These results suggest that we should continue to hold aspirin for 24 hours after TPA administration in stroke patients.

While strokes are a significant source of morbidity and mortality in the elderly, Perissinotto et al. looked at another risk factor we don’t always consider – loneliness.[6] The investigators followed a cohort of 1,604 mostly Caucasian adults with a mean age of 71 years, with a baseline assessment and follow-up assessments every 2 years. At each visit, the investigators asked participants whether they felt left out, isolated, or lacked companionship. Participants were categorized as “not lonely” if they responded “hardly ever” to all 3 questions and “lonely” if they responded “some of the time” or “often” to any of them. The investigators then looked at the primary outcomes of time to death over 6 years and functional decline over 6 years on the following 4 measures:

• difficulty in an increased number of ADLs

• difficulty in an increased number of upper extremity tasks

• decline in mobility

• increased difficulty in stair climbing

The study found that while 18% of the participants lived alone, 43% reported feeling lonely; in fact, the majority of lonely persons lived with someone. And while lonely subjects were more likely to be depressed (37.5% vs. 10.8%), most lonely subjects were not depressed (62.5%). Over the 6-year follow-up period, loneliness was associated with an increased risk of death (22.8% vs. 14.2%); even after adjustment for confounders, including illness severity and depression, the hazard ratio was 1.45. Similarly, lonely participants showed greater decline in the ability to perform ADLs (24.8% vs. 12.5%; adjusted risk ratio 1.59), more difficulty with upper extremity tasks (41.5% vs. 28.3%; adjusted risk ratio 1.28), greater decline in mobility (38.1% vs. 29.4%; adjusted risk ratio 1.18), and more difficulty climbing stairs (40.8% vs. 27.9%; adjusted risk ratio 1.31). Although the mechanisms underlying the association between loneliness and health outcomes cannot be elucidated from an observational study such as this one, the findings underscore the importance of recognizing loneliness as a substantial risk in the elderly.

So, as we plunge into the next academic year, please remember, elderly or not, no patient or colleague should be left alone and if you see someone struggling, please lend a helping hand.

Alexander Volodarskiy, MD is a 3rd year resident at NYU Langone Medical Center

Peer Reviewed by Ishmeal Bradley, MD Section Editor, Clinical Correlations


1. Kang D-H, Kim Y-J, Kim S-H, et al. Early Surgery versus Conventional Treatment for Infective Endocarditis. New England Journal of Medicine 2012;366:2466-73.

2. Gordon SM, Pettersson GB. Native-Valve Infective Endocarditis — When Does It Require Surgery? New England Journal of Medicine 2012;366:2519-21.

3. Xian Y, Liang L, Smith EE, et al. Risks of intracranial hemorrhage among patients with acute ischemic stroke receiving warfarin and treated with intravenous tissue plasminogen activator. JAMA: The Journal of the American Medical Association 2012;307:2600-8.

4. Zinkstok SM, Roos YB. Early administration of aspirin in patients treated with alteplase for acute ischaemic stroke: a randomised controlled trial. The Lancet 2012.

5. Parsons MW, Levi CR. Reperfusion trials for acute ischaemic stroke. The Lancet 2012.

6. Perissinotto CM, Stijacic Cenzer I, Covinsky KE. Loneliness in older persons: a predictor of functional decline and death. Archives of Internal Medicine 2012:1-7.

Primecuts – This Week In The Journals

June 26, 2012

By Lakshmi Tummala, MD and Todd Cutler, MD

In the spirit of prevention, please wear hats and apply a generous amount of sunscreen while reading this week’s issue of primecuts.

This week the United States Preventive Services Task Force released updated guidelines for cervical cancer screening, [1] recommending that women aged 21 to 65 years, regardless of sexual history, be screened with a Pap smear every three years. Beginning at age 30, women who wish to lengthen the interval between screenings have the option of combined cytology and HPV testing every five years. Both recommendations are grade A, based on evidence showing decreased incidence of, or mortality from, cervical cancer with screening.

Importantly, these recommendations do not apply to women with a history of exposure to diethylstilbestrol, prior cervical cancer or high-grade precancerous lesion, or any form of immunodeficiency. Furthermore, there is no role for screening with HPV testing before 30 years of age, screening at all before 21 years or after 65 years of age (the latter assuming adequate screening to that point) or for screening women after hysterectomy who no longer have a cervix (all grade D).

In comparison to the previous guidelines from 2003, this update acknowledges the higher rate of false positives with HPV testing and the associated risks. It also clearly suggests a time to initiate screening based on patterns of HPV infection with the intent to reduce harm, as management of a cervical lesion does have potentially adverse health consequences for women, especially in pregnancy. Lastly, the authors note that the change from annual screening to every three years may translate into decreased contact between young women and their healthcare providers, and that other routine preventive healthcare measures should not be compromised as a result. We wait to see how these new recommendations will change our clinic practice.

Traditionally, patients with alcoholic cirrhosis have been considered at significantly higher risk of developing hepatocellular carcinoma (HCC) than the general population, and routine imaging is generally recommended for screening [2], although the US Preventive Services Task Force has no recommendation regarding HCC screening in alcoholic cirrhosis. Both the American Association for the Study of Liver Diseases and the European Association for the Study of the Liver suggest that patients with alcoholic cirrhosis be screened with ultrasound or CT scan every six months [3].

Despite this, there have been no randomized clinical trials evaluating the impact of regular HCC screening on patients with alcoholic cirrhosis. This is, in part, because of widely disparate estimates of the overall incidence of HCC in alcoholic cirrhosis, which in different studies has ranged from 1% to 16% over five years [4,5].

A publication out of Denmark this week investigated the incidence of HCC in patients with alcoholic cirrhosis [6]. A national registry was used to evaluate all patients identified in the health care system as having alcoholic cirrhosis without underlying viral hepatitis. Since, during this time, no national screening guidelines existed for alcoholic cirrhosis, these patients were monitored over a period without standardized surveillance. The primary outcomes were incidence of HCC and overall mortality.

Of 8,482 patients evaluated, 169 (2.0%) were diagnosed with HCC – a number that increased slightly over time; the risk at five years was about 1%. Survival after diagnosis was short, with a median of 97 days for localized HCC and 37 days for nonlocalized HCC. In the same group, 5,734 patients (67.6%) died during the study period, and 43.7% died within five years of diagnosis. Only about 1.8% of the total deaths in the population were attributable to HCC.

While not an insignificant percentage, the authors suggest it is unlikely that any national surveillance program would have a significant effect on mortality in patients with alcoholic cirrhosis. This is based on the belief that an annual risk greater than 1.5% is needed for HCC screening to yield a mortality benefit. The authors argue that this study is likely generalizable to other populations with alcoholic cirrhosis and may inform future guidelines on HCC screening in these patients [7].
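The comparison against the 1.5% threshold above can be made concrete by annualizing the observed five-year risk. A minimal sketch, assuming a constant hazard over the five years (a simplification of the registry data):

```python
def annualized_risk(cumulative_risk, years):
    """Convert a cumulative risk over `years` into the constant annual risk
    that would produce it: 1 - (1 - annual)^years = cumulative."""
    return 1 - (1 - cumulative_risk) ** (1 / years)

# Five-year HCC risk of ~1% reported in the Danish cohort
annual = annualized_risk(0.01, 5)
```

The implied annual risk works out to roughly 0.2% per year, an order of magnitude below the 1.5% annual threshold cited as necessary for screening to yield a mortality benefit, which is the core of the authors' argument.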

While a role for zidovudine in preventing perinatal transmission of HIV has been firmly established, this week, the NEJM published a study looking at evidence for improved efficacy with combination anti-retroviral therapy [8] – a trend already in practice for some high-risk populations.

Starting in 2004, subjects were recruited primarily from Brazil and South Africa; mothers and infants were excluded if they had received any antiretrovirals besides zidovudine. 1,745 infants were enrolled within 48 hours of birth and randomized to one of three arms: zidovudine alone, zidovudine plus nevirapine, or zidovudine plus nelfinavir and lamivudine. They were tested for HIV viral load at birth and again at multiple intervals, with a primary endpoint of intrapartum HIV infection at three months. Infants found to be infected in utero (positive test at birth) were excluded from the analysis; those found to have an intrapartum infection (positive test after a previous negative test) were subsequently withdrawn from the study to initiate HIV treatment. Breast-feeding was discouraged, with less than one percent of mothers doing so by two weeks post-partum.

There was no difference in the rates of in utero transmission of HIV among the three groups. At three months, infants who received zidovudine alone had a higher rate of intrapartum transmission (4.8%) than infants who received two drugs (2.2%) or three drugs (2.4%). Overall transmission of HIV at almost nine months of follow-up showed a similar trend, with 11.0% of the infants in the zidovudine-alone group carrying the diagnosis, versus 7.1% in the zidovudine/nevirapine group and 7.4% in the zidovudine/nelfinavir/lamivudine group. There was no difference in mortality rates among the three groups. Infants who received three anti-retrovirals had significantly higher rates of neutropenia than the one- or two-drug infant groups (this neutropenia, along with anemia, comprised most of the documented adverse events).

Overall rates of transmission were lower than initially projected, but the differences between groups are still impressive and statistically significant. By self-report, these patients had a 96% adherence rate, which may be higher than can realistically be expected in future applications. On analysis, there were no significant differences in resistance patterns between the groups. Also of note, 9% of the mothers were co-infected with syphilis, though the impact of this on HIV infectivity was not explored by the authors. In conclusion, it seems that with the addition of a single drug, we can substantially decrease the incidence of intrapartum HIV infection without significantly increased adverse health effects.
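The clinical impact of the two-drug regimen above can be expressed as an absolute risk reduction and a number needed to treat, computed from the reported three-month intrapartum transmission rates:

```python
import math

def arr_nnt(risk_control, risk_treatment):
    """Absolute risk reduction and number needed to treat (rounded up)."""
    arr = risk_control - risk_treatment
    return arr, math.ceil(1 / arr)

# Intrapartum transmission at 3 months: zidovudine alone (4.8%) vs two drugs (2.2%)
arr, nnt = arr_nnt(0.048, 0.022)
```

The absolute risk reduction of 2.6 percentage points corresponds to treating roughly 39 infants with the two-drug regimen to prevent one intrapartum infection, relative to zidovudine alone.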

Lakshmi Tummala, MD  is a Chief Resident at NYU Langone Medical Center

Dr. Todd Cutler is also a Chief Resident at NYU Langone Medical Center and an associate editor for Clinical Correlations

Image Courtesy of Wikimedia Commons


[1] Moyer et al. Screening for cervical cancer: U.S. Preventive Services Task Force Recommendation Statement. Ann Intern Med. 2012 Jun 19;156(12):880-91. Available from:

[2] Sherman M. Hepatocellular carcinoma: screening and staging. Clin Liver Dis. 2011;15:323-34, vii-x. Available from:

[3] Bruix et al. Management of hepatocellular carcinoma: an update. Hepatology. 2011 Mar;53(3):1020-2. Available from:

[4] Toshikuni et al. Comparison of outcomes between patients with alcoholic cirrhosis and those with hepatitis C virus-related cirrhosis. J Gastroenterol Hepatol. 2009 Jul;24(7):1276-83. Available from:

[5] Fleming et al. All-cause mortality in people with cirrhosis compared with the general population: a population-based cohort study. Liver Int. 2012 Jan;32(1):79-84. doi: 10.1111/j.1478-3231.2011.02517.x. Available from:

[6] Jepsen et al. Risk for hepatocellular carcinoma in patients with alcoholic cirrhosis: A Danish nationwide cohort study. Ann Intern Med. 2012 Jun 19;156(12):841-7. Available from:

[7] Amy Norton. Study doubts value of liver cancer screening [Internet]. Reuters. 06/19/12. Available from:

[8] Nielsen-Saines et al. Three postpartum antiretroviral regimens to prevent intrapartum HIV infection. N Engl J Med. 2012 Jun 21;366(25):2368-79. Available from:

Primecuts – This Week In The Journals

June 12, 2012

By Sagar S. Mungekar, MD

Faculty Peer Reviewed

President Obama’s Affordable Care Act was in the news again this week, with proponents and opponents both voicing their opinions on how the Supreme Court should rule later this month. In March the Court considered the notion of severability; that is, whether certain provisions of the law could stand without the individual mandate or whether the entire law would have to go. The core provision of the act—the so-called individual mandate—would require nearly everyone to possess insurance, either from an employer or from a direct purchase. The decision of which and how much insurance coverage to purchase may be driven by patients’ own conceptions or predictions of how sick they are and whether they would be high- or low-utilizers of the medical system.

We saw several risk-stratification tools in the journals this week, many of which sought to assign patients a number or score to denote a better or worse prognosis. In this PrimeCuts, we will briefly review these prediction tools, as well as the study designs used to develop them.

The first is a cohort study of heart failure (HF) patients.[1] Decompensated HF patients are hospitalized often, and during the initial ER presentation the decision to admit or discharge is often difficult. The authors sought to derive a model to predict acute mortality in HF patients. The data were abstracted from the medical charts of 12,591 Canadian patients who presented to the ER with symptoms of a heart failure exacerbation. They excluded patients who arrived in the ER as transfers from other hospitals, those with DNR orders, and those dependent on dialysis. The patients were divided into derivation and validation cohorts, with approximately two-thirds hospitalized and one-third discharged in each. Using multivariable regression analysis, the authors identified ten variables, weighted each one, and developed the Emergency Heart Failure Mortality Risk Grade (EHMRG), a score centered around 0. The cohorts behaved similarly: each 20-point rise in EHMRG was associated with a 41% increase in the odds of 7-day death (OR 1.41; 95% CI 1.34-1.48) in the derivation group and a 39% increase (OR 1.39; 95% CI 1.32-1.47) in the validation group.
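Because the reported odds ratios are per 20-point increment, the odds ratio for any other score difference follows by exponent rescaling, under the log-linearity implicit in the regression. A minimal illustrative sketch (not part of the published tool):

```python
def or_for_delta(or_per_20: float, delta_points: float) -> float:
    """Rescale an odds ratio reported per 20-point EHMRG increment to an
    arbitrary score difference (assumes the log-odds relationship is linear)."""
    return or_per_20 ** (delta_points / 20.0)

# Derivation-cohort estimate: OR 1.41 per 20 points.
# A 40-point difference is two increments, i.e. 1.41 squared.
print(round(or_for_delta(1.41, 40), 2))  # -> 1.99
```

This also makes the caveats concrete: the 60 points added for EMS transport shift a patient's score by three increments regardless of clinical severity.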

This sort of cohort study is valuable because it looks at all comers; older studies excluded those discharged directly from the ER. However, several factors should be noted before using the EHMRG at one's own institution. The algorithm adds 60 points for patients transported by EMS, regardless of whether such emergent transportation was necessary, so patients in urban centers with higher EMS utilization may have falsely elevated scores. Another component of the score is the absolute creatinine concentration, not the change from the patient's baseline. While the predictive value of this variable was validated, the cohorts had mean creatinine concentrations of ~1.4 +/- 0.8 mg/dL. This relatively narrow distribution possibly reflects patient homogeneity, and medical centers with more racially diverse populations may need to weigh this variable differently.

With that last thought in mind, the next article in the same issue caught my eye: "Estimating Equations for Glomerular Filtration Rate in the Era of Creatinine Standardization."[2] These authors performed a systematic review of studies comparing creatinine-based GFR equations to a reference standard, again asking whether a single algorithm can determine the GFR across different populations. Twenty studies were included in the analysis. Neither the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) nor the Modification of Diet in Renal Disease (MDRD) study equation worked well for all patients: the former performed better at higher GFRs and the latter at lower ones. Note that these equations were, in general, fitted to populations in North America, Europe, and Australia and required modifications for Asian and African populations.
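To see why the two equations can diverge, here is a sketch of both as commonly published (the IDMS-traceable 4-variable MDRD equation and the 2009 CKD-EPI creatinine equation). The coefficients are from the original publications; this is for illustration only, not clinical use.

```python
def egfr_mdrd(scr_mg_dl: float, age: int, female: bool = False,
              black: bool = False) -> float:
    """IDMS-traceable 4-variable MDRD study equation (mL/min/1.73 m^2)."""
    gfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

def egfr_ckd_epi(scr_mg_dl: float, age: int, female: bool = False,
                 black: bool = False) -> float:
    """CKD-EPI 2009 creatinine equation (mL/min/1.73 m^2)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    gfr = (141.0 * min(ratio, 1.0) ** alpha
           * max(ratio, 1.0) ** -1.209 * 0.993 ** age)
    if female:
        gfr *= 1.018
    if black:
        gfr *= 1.159
    return gfr

# A 60-year-old man with a creatinine of 1.0 mg/dL: the CKD-EPI estimate
# (~81) exceeds the MDRD estimate (~76), consistent with CKD-EPI's better
# performance at higher GFRs.
print(round(egfr_mdrd(1.0, 60)), round(egfr_ckd_epi(1.0, 60)))
```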

From my own experiences, I have noticed that these calculations can vary significantly: while caring for a patient recently, three members of our team used three different equations and calculated three widely ranging estimates of the GFR, varying by almost a factor of two. Determining the GFR is clearly important when dosing medications and risk-stratifying, but in hospitals with diverse populations, the estimate provided by clinical laboratories as part of the basic metabolic panel may be inaccurate. Physicians may have to “personalize” these equations to estimate the GFR and document the method to maintain consistency.

The idea of tailoring calculations by patient demographics is clearly not a new one. Flipping to the British Medical Journal, we find a new adjustment validated for an old test: the D-dimer concentration used to rule out deep vein thrombosis (DVT).[3] The authors performed a retrospective, cross-sectional diagnostic analysis to determine whether the conventional cutoff of 500 μg/L could be raised for elderly populations. The data came from just two accuracy studies but included 1374 patients, comparing initial D-dimer values to ultrasonographic evidence of DVT. The age-dependent cutoff allowed the researchers to exclude 5.7% more patients (absolute increase; 95% CI 4.1%-7.8%). A fixed cutoff of 750 μg/L in patients 60 years of age and older fared similarly, excluding 5.4% more patients (absolute increase; 95% CI 3.8%-7.4%). Notably, the raised thresholds did not miss any extra cases.
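The two strategies compared in the BMJ paper reduce to simple cutoff rules: the age-dependent threshold of the patient's age times 10 μg/L for patients older than 50 (the conventional 500 μg/L otherwise), and the alternative fixed 750 μg/L threshold for patients 60 and over. A sketch:

```python
def ddimer_cutoff_age_adjusted(age: int) -> int:
    """Age-dependent D-dimer cutoff in ug/L: age x 10 above age 50,
    conventional 500 ug/L otherwise."""
    return age * 10 if age > 50 else 500

def ddimer_cutoff_fixed_elderly(age: int) -> int:
    """Alternative rule: fixed 750 ug/L for patients 60 or older."""
    return 750 if age >= 60 else 500

# An 80-year-old with a D-dimer of 700 ug/L would fall below either
# raised threshold, but above the conventional 500 ug/L cutoff.
print(ddimer_cutoff_age_adjusted(80), ddimer_cutoff_fixed_elderly(80))  # -> 800 750
```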

The role of D-dimer measurement is often questioned; this study suggests that in elderly patients, a value above the usual dichotomous cutoff of 500 μg/L can still be useful in ruling out DVT.

Jumping to the Archives, online readers found another prediction tool, this one for favorable neurological outcomes in survivors of in-hospital cardiac arrest.[4] This was also a cohort study, including ~43,000 patients who survived an in-hospital arrest outside of the ER, OR, or procedure areas. The primary outcome was favorable neurological survival to discharge. The predictive variables included age, arrest rhythm, time to defibrillation, pre-arrest cognitive state (estimated retrospectively after the arrest), level of monitoring, hospital location, duration of resuscitation, and comorbid factors, which the authors compiled into the Cardiac Arrest Survival Postresuscitation In-hospital (CASPRI) score. The nomogram relating the score to the percentage with favorable neurological survival appears sigmoidal: survival drops precipitously as scores rise from 0, and higher scores (20-40) correspond to less than 20% favorable outcomes.

While this model performed well in the validation group, the utility of the score is narrow. As noted in the invited commentary, the only factor physicians may be able to control is how and whether the patient was monitored.[5] Furthermore, since the score must be calculated after the arrest has been survived, the CASPRI score is not designed to predict the outcome of the arrest itself; rather, it predicts a favorable neurological outcome at discharge, which must then be extrapolated to subsequent quality of life.

The last study I'll highlight, from The Lancet, interested me for both its content and its type of analysis.[6] The authors used Mendelian randomization, a relatively new method (first described in 1991) that aims to approximate a randomized controlled trial using observationally obtained data. The theory is fairly straightforward: if we assume that genetic polymorphisms assort randomly, and we want to test whether a particular gene or its downstream product affects a disease, then inheriting a variant that alters this biochemical pathway should also alter the presence or extent of the disease. The method takes advantage of the inherent genetic randomization that occurs in nature, without the need to randomize individuals for a study a priori.

In this study, the authors used single nucleotide polymorphisms (SNPs) associated with plasma HDL concentration to test whether elevated levels protect against myocardial infarction (MI). Some 50,000 participants were studied, of whom over 4,000 suffered an MI. The authors validated both the population and the method by testing a known correlation: high LDL increases the risk of MI. They then tested a specific SNP in the endothelial lipase gene that raises HDL cholesterol without changing other lipid levels. The rise in HDL attributable to this SNP would have been expected to decrease the risk of MI by 13% (95% CI 9%-16%), but surprisingly there was no association between the SNP and MI (OR 0.99; 95% CI 0.88-1.11). Similarly, 14 other SNPs associated with higher HDL were not associated with a decrease in MI risk. These data challenge the dogma that HDL is the good cholesterol that "helps."
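The "expected" 13% reduction comes from scaling the observational HDL-MI association by the SNP's effect on HDL, under the core Mendelian-randomization assumption that the SNP influences MI risk only through HDL. A toy version of that arithmetic, with hypothetical inputs (the observational OR per SD of HDL and the SNP's HDL effect below are illustrative, not the study's values):

```python
import math

def expected_or(obs_or_per_unit: float, snp_effect_units: float) -> float:
    """Expected OR for disease conferred by a SNP, assuming the SNP acts on
    risk only through its effect on the biomarker (core MR assumption)."""
    return math.exp(snp_effect_units * math.log(obs_or_per_unit))

# Hypothetical: observational OR 0.62 per 1-SD rise in HDL, and a SNP that
# raises HDL by 0.3 SD -> expected OR ~0.87, i.e. a ~13% expected reduction.
print(round(expected_or(0.62, 0.3), 2))  # -> 0.87
```

The study's surprise was that the genetic estimate (OR 0.99) fell far short of this expectation, undermining a causal reading of the observational association.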

Mendelian randomization obviously has its limitations; not all SNP distributions are random if genes are linked, and many genes are pleiotropic. Furthermore, while the authors were fortunate to have found a specific SNP that affects HDL without affecting several other serum and clinical measures, a candidate gene or polymorphism is not always available. Still, as the human genome repository grows, we grow closer to testing genotypes and markers as surrogates for the patients themselves. These five articles have outlined that while reducing a patient to a numerical score can simplify complex cases, the idea of personalization is still relevant as confounding factors abound. Molecular genetic fingerprinting such as SNPs may allow physicians in the coming years to offer patients even more tailored advice as they seek to assign risk and prognosticate.

Dr. Sagar Mungekar is a 1st year resident at NYU Langone Medical Center

Peer reviewed by Ishmeal Bradley, MD, section editor, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Lee DS, et al. Prediction of heart failure mortality in emergent care. Ann Intern Med. 2012;156:767-775.

2. Earley A, Miskulin DM, Lamb EJ, Levey AS, Uhlig K. Estimating equations for glomerular filtration rate in the era of creatinine standardization. Ann Intern Med. 2012;156:785-795.

3. Schouten HJ, et al. Validation of two age dependent D-dimer cut-off values for exclusion of deep vein thrombosis in suspected elderly patients in primary care. BMJ. 2012;344:e2985.

4. Chan PS, et al. A validated prediction tool for initial survivors of in-hospital cardiac arrest. Arch Intern Med. 2012; published online: e2-e7.

5. Huszti E, Nichol G. Prediction of "mostly dead" vs "all dead" after in-hospital cardiac arrest. Comment on "A validated prediction tool for initial survivors of in-hospital cardiac arrest." Arch Intern Med. 2012; published online: e8.

6. Voight BF, et al. Plasma HDL cholesterol and risk of myocardial infarction: a Mendelian randomisation study. Lancet. 2012; epub May 17, 2012. doi:10.1016/S0140-6736(12)60312-2.

Primecuts – This Week In The Journals

June 5, 2012

By Joshua Strauss, MD

Faculty Peer Reviewed

Temperatures have been rising in New York this past week, with Memorial Day highs approaching 90 degrees. This was a great setup for trips to the shore, barbecuing, sitting outside… and drinking an enormous regular soda?

Maybe not. For several days this past week, health news in New York was dominated by the mayor’s announcement of his plan to ban sugary drinks (such as sodas, sweetened iced teas, and energy drinks) larger than 16 fluid ounces from restaurants, movie theatres, sports arenas and street carts in New York City. The effort comes as part of Mayor Bloomberg’s continued push to address issues of public health in the city, including bans on smoking in restaurants and prohibitions against artificial trans fat in restaurant food. The plan has provided ample comedic fodder to late night shows, but whether or not you believe Mayor (err… Nanny?) Bloomberg is overreaching, two things seem clear to me: (1) Drinking more than 16 ounces of regular soda in one sitting is not good for you, and (2) New York City – where more than half of adults are obese or overweight – has a weight problem and something needs to be done about it.

As the vociferous debates over "sodagate" receded, it seemed to me that health news in the popular media became inundated with new data on cancer drugs. This was not my bias as a budding oncologist, but rather a direct result of presentations and article releases coinciding with the annual meeting of ASCO (American Society of Clinical Oncology) in Chicago. The biggest headline-grabbers were anti-PD-1 antibody therapy, T-DM1, and trametinib.

– Treatment targeting the programmed death 1 (PD-1) protein, a mediator of immunosuppression, and one of its ligands (PD-L1) was shown to induce tumor regression in patients with advanced non-small cell lung cancer, melanoma, or renal cell cancer. In other words, removing one of the body's innate checks on immune over-response unleashes the immune system to fight tumor cells. One of the studies looked at 300 adults with advanced cancers treated with an anti-PD-1 antibody for up to 2 years. Objective tumor responses occurred in 18% of patients with non-small-cell lung cancer, 28% with melanoma, and 27% with renal-cell cancer. More than half of the responses lasted a year or longer. In the second study, some 200 patients were treated with an anti-PD-L1 antibody, and again objective responses were observed. The findings, presented at ASCO and published in the New England Journal of Medicine, represent an important advance in harnessing immune responses to treat cancer. This approach is a breakthrough in particular because it is active against lung cancer, which has been notoriously resistant to immunotherapy, and because the response rates are almost double those of previous immunotherapies. According to the NEJM editorialist, PD-1 blockade "may well have a major effect on cancer treatment."

– A new drug for breast cancer is being described as a "heat-seeking missile" or a "smart bomb." T-DM1 (trastuzumab emtansine) is a monoclonal antibody joined to a powerful chemotherapy agent, one of a new breed of therapies called antibody-drug conjugates. By combining trastuzumab and emtansine, T-DM1 both disrupts HER2 signaling and transports the cytotoxic agent DM1 directly into tumor cells. It appears superior to standard treatment in HER2-positive locally advanced or metastatic breast cancer: T-DM1 met its prespecified progression-free survival goal, delaying disease progression for a median of 9.6 months versus 6.4 months. It approached but did not meet its coprimary endpoint of overall survival, although there was an apparent survival benefit in the T-DM1 arm. Importantly, because the cytotoxic drug is inactive until it reaches the tumor, the drug's side-effect profile was favorable. The results of this phase III trial were also presented at ASCO over the weekend.

– Advanced melanoma has long been resistant to treatment, but several recent advances have provided some optimism. In 2010, the immunotherapy ipilimumab (Yervoy) was approved by the FDA after improving overall survival in this disease. On its heels in 2011, the BRAF kinase inhibitor vemurafenib (Zelboraf) was approved after improving overall survival in previously untreated metastatic melanoma with the BRAF V600E mutation. Now a third drug has arrived: trametinib. Results of a phase III trial of this oral selective MEK inhibitor were published online this week in the NEJM and also demonstrate a survival advantage: median progression-free survival was 4.8 months in the trametinib group versus 1.5 months in the chemotherapy group. After so many years without a significant treatment, there seem to be true breakthroughs in the treatment of this deadly disease.

Moving on from cancer drugs, I was intrigued by another study published in the New England Journal of Medicine this week entitled "Nighttime Intensivist Staffing and Mortality Among Critically Ill Patients." (Perhaps my intrigue was related to my imminent rotation in the medical ICU!) The study comes as many hospitals are revisiting their approach to ICU staffing, with recent studies consistently showing that daytime intensivist physician staffing is associated with improved outcomes among ICU patients. Some have proposed that extending intensivist coverage into nighttime hours would improve outcomes further. The study was conducted retrospectively, with data obtained from the Acute Physiology and Chronic Health Evaluation (APACHE) clinical outcomes database. Data were analyzed from 25 hospitals and approximately 65,000 ICU admissions: 14,000 at hospitals with nighttime intensivists (defined as "an intensivist attending physician who was physically present in the ICU or elsewhere in the hospital and immediately available to manage ICU emergencies during nighttime hours") and 51,000 at hospitals without. When hospitals were stratified into low-intensity daytime staffing (optional consultation with the intensivist) and high-intensity daytime staffing (mandatory consultation with the intensivist or primary transfer of care to the intensivist), the addition of a nighttime intensivist reduced risk-adjusted in-hospital mortality only in the low-intensity group. When the definition of "nighttime intensivist" was broadened to include resident physicians, nighttime staffing reduced in-hospital mortality in both groups. Thus, while the study has several limitations, it suggests that in ICUs staffed overnight by resident physicians, the addition of an overnight intensivist confers little incremental benefit in reducing ICU mortality.

The study is interesting to me because both models are represented in the hospitals affiliated with the NYU medicine residency program and I was unaware that any data existed comparing the models. It furthered my thinking about the dearth of evidence for any of the training models and coverage schemes we currently employ in our hospital systems and the important need for studies like this one to sort out the issues and optimize care.

In the same NEJM issue, a perspective piece analyzes recent residents’ views regarding the 2011 duty-hour regulations mandated by the ACGME Duty Hours Task Force. The authors note that “twice as many residents reported receiving better supervision as reported receiving worse supervision (17.9% vs. 8.3%), though the availability of supervision was overwhelmingly thought to be unchanged (73.8%). Although 42.8% of residents reported no change in the quality of education, a nearly equal proportion (40.9%) reported worsened education — a far greater number than those who saw improvement (16.3%). Similarly, a majority (51.5%) of residents believed that preparation for more senior roles was worse.”

These surveys suggest that a one-size-fits-all approach may not be adequate or appropriate for all trainees and training programs. As in the case of the overnight intensivist, real study is needed to refine our training models and coverage schemes, optimizing safety and quality of care for our patients as well as education for our residents.

Dr. Joshua Strauss is a 4th year internal medicine resident at NYU Langone Medical Center

Peer reviewed by Robert Gianotti, MD, Associate Editor, Clinical Correlations

Image courtesy of Wikimedia Commons



Primecuts – This Week In The Journals

May 14, 2012

By Robert Mocharla, MD

Faculty Peer Reviewed

New York and the world both had a busy few days this week. We all shared a collective cringe when JP Morgan announced a monstrous financial loss in an already volatile (to put it lightly) market. However, with the news that the Rangers took game 7, we were invited to live in the moment for just a bit longer and turn the city back into a hockey town for a few more days. On the medical side, there was much to report as well. Increasing attention to long-term bisphosphonate use has brought several key bisphosphonate studies into the limelight. Additionally, long-awaited results of a trial of out-of-hospital ACS treatment were released this week (the IMMEDIATE trial). Finally, and a little bit on the lighter side (pun intended), a meta-analysis was released this past week detailing the effect of probiotic administration on the prevention and treatment of antibiotic-associated diarrhea.

Osteoporosis has posed a longstanding challenge to both patients and physicians, as it is a significant cause of morbidity and mortality among the aging population. In 1996, to the delight of all, the FDA approved bisphosphonate therapy for the treatment of osteoporosis, and since that time, studies have consistently shown its effectiveness in preventing osteoporotic fractures.[1] However, after increasing reports in the 2000s of rare but significant adverse events (including osteonecrosis of the jaw, femoral diaphyseal fractures, and esophageal cancer), the FDA commissioned a systematic review of the long-term data on bisphosphonate use. The report was initially released in late 2011 [2]; however, a recent New England Journal of Medicine Perspective has brought the topic back to the headlines this past week.[3]

Several points are important to note before proceeding. First, the adverse events of concern are extremely rare (and assumed to result from cumulative bisphosphonate exposure), and few previous studies specifically examined these events as endpoints. Several studies did show an increased incidence of the above-mentioned events but could not establish causation. Additionally, most of these studies lasted no longer than 3-4 years, making it difficult to assess long-term risk accurately. Furthermore, bisphosphonates can persist in bone tissue for years after discontinuation, theoretically exerting a prolonged positive effect on bone density. The FDA therefore also asked when therapy can be safely discontinued without significantly increasing fracture risk, thereby minimizing unnecessary bisphosphonate exposure. To do this, the FDA reviewed four of the major bisphosphonate studies, each following a different agent (Fosamax, Actonel, Boniva, Reclast). The study most cited in the report (and in subsequent criticism) was the FLEX trial, which had the longest follow-up (10 years).[4] Interestingly, that study found that rates of vertebral and non-vertebral osteoporotic fractures were nearly the same whether patients took a bisphosphonate (alendronate) for the full 10 years or were switched to placebo for years 5-10 (fracture rates 17.7% vs 16.9%, respectively). For now, the bottom line is that there is an increased incidence of significant (but rare) adverse events with long-term bisphosphonate use. The good news is that discontinuing bisphosphonate therapy after several years may not significantly raise the likelihood of subsequent osteoporotic fractures. Further research will hopefully elucidate the optimal duration of therapy that minimizes the risk of side effects.

Now, changing gears to other news, the results of the IMMEDIATE trial were recently released.[5] The trial, titled "Out-of-Hospital Administration of Intravenous Glucose-Insulin-Potassium in Patients With Suspected Acute Coronary Syndromes," was a randomized controlled trial comparing intravenous glucose-insulin-potassium (GIK) with placebo given immediately upon suspicion of ACS in the out-of-hospital setting. The authors, drawing on previous in-hospital data showing a reduction in myocardial ischemia and progression to malignant arrhythmias after administration of IV GIK, hypothesized that even earlier administration could reduce progression of unstable angina to myocardial infarction (the primary endpoint) as well as cardiac arrest, mortality, and heart failure (secondary endpoints). The study took place from 2006 to 2011, involved 13 U.S. cities, and examined over 900 patients. The key feature was that trained paramedics could give the IV treatment immediately upon high suspicion of ACS. For the primary endpoint, there was no significant difference between treatment and placebo arms (48.7% vs 52.6%; 95% CI 0.66-1.13; p=0.28). However, the composite endpoint of cardiac arrest or in-hospital mortality did show a statistically significant benefit (4.4% vs 8.7%; 95% CI 0.27-0.85; p=0.01). The authors concluded that, although this treatment did not retard progression to MI, it may decrease the likelihood of developing ischemia-related malignant arrhythmias. Finally, the authors also analyzed median infarct size in the two study arms (2% of LV mass with GIK vs 10% with placebo; p=0.1). They tie the results together by noting that IV GIK reduces serum free fatty acid levels, which are known to accumulate peri-infarction and promote arrhythmias. Though the authors aren't necessarily convinced that out-of-hospital IV GIK is beneficial, they note that it was shown not to be harmful and that further study can better define which populations, if any, could benefit from the therapy.

Moving on, an interesting meta-analysis published this week in JAMA is titled "Probiotics for the Prevention and Treatment of Antibiotic-Associated Diarrhea."[6] The authors concluded that probiotics, used as adjunctive therapy, are consistently beneficial in preventing antibiotic-associated diarrhea. Antibiotic-associated diarrhea is a mere inconvenience for some, but for others it can prove life-threatening. The underlying mechanism is largely thought to be disruption of the normal gut flora during antibiotic treatment. The reviewers pooled 82 randomized controlled trials from the available literature, allowing a wide variety of trials (varied indications for the initial antibiotic therapy, types of antibiotics given, and probiotic strains administered). The main analysis showed a pooled relative risk of 0.58, with a number needed to treat of only 13. However, the most interesting point of the analysis may be the lack of any comprehensive studies addressing probiotics as potential prophylaxis. The reviewers noted that only 10% of the studies they reviewed were adequately powered to evaluate the effect of probiotics statistically. Moreover, most of the available studies lacked key information needed for widespread clinical application. If anything can be learned from this analysis, it is that more research is needed on this inexpensive and potentially life-saving therapy.
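For readers less familiar with translating a pooled relative risk into a number needed to treat: NNT = 1 / (control event rate × (1 − RR)). The control event rate below is hypothetical, chosen only to show the mechanics, not taken from the meta-analysis:

```python
import math

def nnt(control_event_rate: float, relative_risk: float) -> int:
    """Number needed to treat from a control event rate and a relative risk.
    ARR = CER * (1 - RR); NNT = ceil(1 / ARR)."""
    arr = control_event_rate * (1.0 - relative_risk)
    return math.ceil(1.0 / arr)

# With the pooled RR of 0.58 and a hypothetical 20% baseline rate of
# antibiotic-associated diarrhea:
print(nnt(0.20, 0.58))  # -> 12
```

Note how sensitive the NNT is to the assumed baseline rate, which is one reason the reviewers' point about missing population details matters for clinical application.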

So, with this week’s Primecuts, it seems that more questions have been generated than answered. There is still no clear consensus regarding bisphosphonates or probiotics, and there most certainly will be numerous studies in the near future attempting to shed light on these therapies. Until then, we will have to make educated decisions based on the most current evidence. Go Rangers.

Dr. Robert Mocharla is a 1st year resident at NYU Langone Medical Center

Peer reviewed by Ishmeal Bradley, section editor, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Nieves JW. Fragility fractures of the hip and femur: incidence and patient characteristics. Osteoporos Int. 2010;21:399-408.


3. Whitaker M, et al. Bisphosphonates for osteoporosis – where do we go from here? N Engl J Med. 2012 May 9. [Epub ahead of print].

4. Black DM, Schwartz AV, Ensrud KE, et al. Effects of continuing or stopping alendronate after 5 years of treatment: the Fracture Intervention Trial Long-term Extension (FLEX): a randomized trial. JAMA. 2006;296:2927-2938.

5. Selker HP, et al. Out-of-hospital administration of intravenous glucose-insulin-potassium in patients with suspected acute coronary syndromes. JAMA. 2012;307(18):1925-1933.

6. Hempel S, et al. Probiotics for the prevention and treatment of antibiotic-associated diarrhea: a systematic review and meta-analysis. JAMA. 2012;307(18):1959-1969.

Primecuts – This Week In The Journals

May 7, 2012

By Jessica Taff, MD

Faculty Peer Reviewed

With the thrills of the Kentucky Derby and Cinco de Mayo now behind us, we are left awaiting the May flowers promised by an abundance of April showers. In our anticipation of the warm days of summer, popular activities appear to be intersecting with modern medicine. This past week, Facebook introduced a campaign urging members to announce their organ donor status on the social networking site, in the hope that the issue would enter popular consciousness and peer pressure would encourage more people to openly declare their donor status.[1] While this endeavor is certainly noteworthy, the medical literature has brought other topics to the forefront as well. A series of articles in the Annals of Internal Medicine attempts to redefine risk factors for breast cancer and clarify the initiation of breast cancer screening in women ages 40-49. The Archives of Internal Medicine also published two articles focusing on subclinical thyroid states and related mortality. Lastly, Primecuts would not be complete without a foray into cardiology to examine the use of aspirin versus warfarin in patients with heart failure.

Elaborating on the trend of popular behaviors, it seems that most women anticipate referrals for yearly mammography after age 40 as a primary mode of breast cancer screening. Screening guidelines from the U.S. Preventive Services Task Force were last modified in 2009 to recommend biennial mammography for women ages 50-74, but recommendations for screening women ages 40-49 have remained unclear. The absolute benefits of screening these women are smaller because of both the lower incidence of breast cancer and the lower mammographic sensitivity in this population; screening therefore produces more false-positive results and unnecessary, potentially harmful workups. Because of this, the USPSTF now recommends that the decision to start mammographic screening before age 50 be an individualized choice, although guidance for individualizing it is lacking. The Annals of Internal Medicine has now published two jointly conducted studies evaluating the harms and benefits of screening women in this controversial decade.

The first, a comparative modeling study, seeks to identify the threshold relative risk (RR) that would tip the balance of harms and benefits toward recommending mammographic screening for women ages 40-49.[2] Using previously accepted models developed by the National Cancer Institute, based on a cohort of women born in 1960 and followed throughout their lifetime, the study aims to identify the screening method (film vs digital mammography) and interval (annual vs biennial) that would maximize benefit and reduce harm in this population. The screening scenarios modeled included (1) a control group of women ages 50-79 receiving biennial film mammography alone, and then screening extended to ages 40-49 with (2a) biennial film mammography, (2b) annual film mammography, (3a) biennial digital mammography, or (3b) annual digital mammography. In the absence of screening, the models estimate that a median of 153 cases of breast cancer would be diagnosed among 1000 women over age 40. Biennial film mammography, as currently recommended, prevents 6.3 breast cancer deaths and gains 109 life-years at the cost of 883 false-positive mammograms. In all models, adding annual or biennial screening for average-risk women ages 40-49 leads to slight increases in life-years gained and breast cancer deaths averted, but at the expense of greater incremental harm, particularly from false positives and the subsequent workup. Yet in women with an approximately 2-fold increased risk for breast cancer, the balance of benefits and harms was more favorable (median threshold RR of 1.9, with a range across models of 1.5-4.4). For these women, initiating biennial mammographic screening at age 40 approximates the benefit-harm ratio of average-risk women starting screening at age 50. All models showed that adding annual screening or digital mammography (threshold RR 4.3, range 3.3-10) for this 10-year period resulted in substantially more false-positive results than biennial film mammography.

To clarify the patient characteristics that define a 2-fold increase in breast cancer risk, a companion study in the Annals provided a systematic review and meta-analysis of risk factors for breast cancer and their prevalence among women ages 40-49 in the United States.[3] The review included 66 English-language studies of risk factors for breast cancer in this age group. The included studies were limited by variation in measures and reference groups, and results were summarized by meta-analysis weighted by study quality, size, and applicable data. At least a 2-fold increase in breast cancer risk was associated with extremely dense breasts (13% of the population studied) and with having a first-degree relative with breast cancer (9% of the study population). Several factors were associated with a 1.5- to 2.0-fold increased risk, and it should be noted that a combination of these factors could yield a risk that meets the 2.0-fold RR threshold. These include prior breast biopsy, a second-degree relative with breast cancer, and heterogeneously dense breasts. Similarly, current use of oral contraceptives, nulliparity, and age greater than 30 at first birth were associated with a 1.0- to 1.5-fold increased risk. Understanding these risk factors in the context of the screening recommendations allows us to translate statistical findings into usable information that can help us better personalize recommendations for initiating mammographic screening.
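The observation that a combination of moderate risk factors can cross the 2-fold threshold follows from the usual (simplifying) assumption that independent relative risks multiply; a minimal sketch, with the combining rule and the example factors as my own illustration rather than the authors' model:

```python
import math

def combined_rr(rrs):
    # Combine relative risks multiplicatively, assuming independent effects;
    # this is a simplification, since real risk models handle interactions
    # and correlated factors more carefully
    return math.prod(rrs)

# Two hypothetical factors, each conferring about 1.5-fold risk
# (e.g., prior breast biopsy plus heterogeneously dense breasts)
print(f"Combined RR: {combined_rr([1.5, 1.5]):.2f}")
```

Two 1.5-fold factors would, under this assumption, yield a combined RR of 2.25, above the 2.0-fold threshold identified in the modeling study.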

Changing focus and moving superiorly in anatomy, the thyroid gland is also at the forefront of this week’s medical literature in the Archives of Internal Medicine.[4] An article on subclinical hyperthyroidism (SH) examines the long-held belief that treatment is necessary to negate increased long-term risks of total mortality, coronary heart disease (CHD) mortality, CHD events, and atrial fibrillation. Individual studies on this topic have so far been contradictory as to whether treating SH alters mortality. These inconclusive data have been attributed to ecological fallacy, the potential bias in study-level meta-analysis whereby a correlation observed at the population level is assumed to apply at the level of the individual. To avoid this, the current study pooled individual data on 52,674 participants from 10 longitudinal cohort studies that reported baseline thyroid function tests (both thyrotropin and free T4 [FT4]), excluding patients on thyroid-altering medications or with overt hyperthyroidism, as well as studies not using a uniform thyrotropin cutoff. The primary outcomes were total mortality, CHD mortality, CHD events, and incident atrial fibrillation (AF). Endogenous subclinical hyperthyroidism was defined as a thyrotropin level lower than 0.45 mIU/L with a normal FT4 level. Of the participants, 2188 (4.2%) had endogenous SH, with a subgroup of 304 (0.6%) having thyrotropin levels below 0.10 mIU/L. During follow-up, 8527 participants died (1896 from CHD), 3653 had CHD events, and 785 developed incident AF. In age- and sex-adjusted analyses, compared with euthyroid participants, those with subclinical hyperthyroidism had a 24% increased risk of overall mortality, a 29% increased risk of CHD mortality, and a 69% increased risk of incident AF.
The risks were greater in the subgroup with thyrotropin levels below 0.10 mIU/L: in age- and sex-adjusted analyses, the HR for CHD mortality was 1.84 (95% CI, 1.12-3.00) in this subgroup versus 1.24 (95% CI, 0.96-1.61) among all patients with SH, and the HR for incident AF was 2.54 (95% CI, 1.08-5.99) versus 1.63 (95% CI, 1.10-2.41). Further, the risk attributable to SH, after accounting for traditional cardiovascular risk factors, was 14.5% for total mortality and 41.5% for incident AF. These pooled individual-participant data make a convincing argument that SH increases total and CHD mortality. Despite this, current guidelines from the American Thyroid Association remain conservative, recommending treatment of subclinical hyperthyroidism only in patients older than 65 years with thyrotropin levels lower than 0.10 mIU/L.[4] Yet with this new study and knowledge of the long-term risks of untreated SH, we can certainly heed the recommendation to consider treating patients outside of these guidelines.
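For readers who want the arithmetic behind "attributable risk," the attributable fraction among the exposed can be approximated from a hazard ratio as (HR − 1)/HR. A rough sketch using the age- and sex-adjusted HR for incident AF quoted above (the authors' 41.5% figure used their fully adjusted estimate, so this is illustrative only):

```python
def attributable_fraction_exposed(hr):
    # Attributable fraction among the exposed: (HR - 1) / HR,
    # i.e., the share of events in exposed patients attributable to exposure
    return (hr - 1) / hr

# Age- and sex-adjusted HR for incident AF with SH (the 69% increase above)
print(f"Attributable fraction for AF: {attributable_fraction_exposed(1.69):.1%}")
```

This yields roughly 41%, close to the reported 41.5% attributable risk for incident AF.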

Continuing with the thyroid, a second article in the Archives of Internal Medicine turns to subclinical hypothyroidism (SCH).[5] Defined by a serum thyrotropin level of 5.01 to 10.00 mIU/L with a normal FT4 level, SCH is estimated to occur in approximately 10% of the adult population. SCH has a known association with ischemic heart disease (IHD) and other cardiovascular risk factors, including elevated total and LDL-cholesterol concentrations. Several months ago, an article in Clinical Correlations evaluated the recommendations for SCH screening.[6] Although there are no uniformly accepted guidelines, the general consensus is to periodically screen at-risk patients, including pregnant women and those with vague symptoms of fatigue or depression, a strong family history of autoimmune thyroid disease, or type 1 diabetes. If a patient is found to have SCH, however, it remains unclear whether treatment with levothyroxine actually decreases the incidence of IHD. The Archives published a retrospective study using the United Kingdom General Practitioner Research Database, known to be a reliable and comprehensive source of clinical data, analyzing outcomes of patients with SCH treated between 2001 and March 2009. Patients were analyzed in younger (ages 40-70 years) and older (>70 years) subgroups. SCH was found in 3093 younger and 1642 older individuals, who were followed for a median of 7.6 years; 52.8% of the younger and 49.9% of the older patients with SCH were treated with levothyroxine. Among treated younger patients there were 68 incident fatal and nonfatal IHD events (4.2%), versus 97 events (6.6%) among untreated patients (multivariate-adjusted HR 0.61; 95% CI, 0.39-0.95). A temporal analysis calculated an HR of 0.989 (95% CI, 0.986-0.993) for each month of exposure to levothyroxine.
These results contrasted sharply with the findings in the older group, in which treated patients had 104 (12.7%) fatal and nonfatal IHD events and untreated patients had 88 (10.7%) events (adjusted HR 0.99; 95% CI, 0.59-1.33). In neither group was incident AF associated with levothyroxine exposure. Thus, treating SCH in patients younger than 70 was associated with fewer fatal and nonfatal IHD events and lower overall mortality, but these benefits did not extend to our older patients. So while we may now feel reassured in a decision to treat younger patients with SCH, whether to treat our older patients may depend on the need for symptomatic improvement alone.
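The per-month HR from the temporal analysis compounds over time; assuming a simple multiplicative model (my own back-of-envelope, not the authors' analysis), the implied longer-term hazard ratios look like this:

```python
# HR per month of levothyroxine exposure, from the temporal analysis above
monthly_hr = 0.989

# Assume the monthly HR compounds multiplicatively over continued exposure
print(f"Implied HR after 1 year:  {monthly_hr ** 12:.2f}")
print(f"Implied HR after 5 years: {monthly_hr ** 60:.2f}")
```

A seemingly tiny monthly effect (HR 0.989) would, under this assumption, correspond to roughly a 12% relative risk reduction at one year and nearly half at five years, which helps explain why the point estimate for treated younger patients (HR 0.61 over a median 7.6 years) is so much further from 1.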

Shifting gears away from the thyroid, the NEJM this week published WARCEF (Warfarin versus Aspirin in Reduced Cardiac Ejection Fraction), weighing in on the debate over warfarin versus aspirin in patients with heart failure in sinus rhythm.[7] Heart failure is associated with a hypercoagulable state, so there has long been a rationale for oral anticoagulation as a preventive measure in these patients. Prior studies have shown either no significant difference between warfarin and aspirin for this indication or have lacked the power to draw a conclusion.[8,9,10] WARCEF was a double-blind clinical trial conducted at 168 centers in 11 countries. Adults recruited between 2002 and 2010 were eligible if they were in normal sinus rhythm, had no contraindication to warfarin therapy, and had a left ventricular ejection fraction below 35%. Participants were randomized to warfarin plus placebo aspirin, or placebo warfarin plus 325 mg of aspirin daily, and all patients underwent INR testing to maintain blinding. A total of 2305 patients were followed for up to 6 years for a primary outcome of time to first event (ischemic stroke or intracerebral hemorrhage) or death from any cause. Primary-outcome rates were 7.47 events per 100 patient-years in the warfarin group and 7.93 in the aspirin group (HR with warfarin, 0.93; 95% CI, 0.79-1.10; p=0.40), no significant difference between the treatment groups. In a time-varying analysis, the HR decreased by a factor of 0.89 per year (95% CI, 0.80-0.998), slightly favoring warfarin by the fourth year of follow-up, but with only marginal significance (p=0.046). Although warfarin significantly reduced the rate of ischemic stroke throughout follow-up (0.72 vs. 1.36 events per 100 patient-years), this difference was offset by an increased rate of major hemorrhage (1.78 vs. 0.87 events per 100 patient-years). Rates of intracranial and intracerebral hemorrhage were comparable in the two groups (0.27 events with warfarin vs. 0.22 with aspirin per 100 patient-years; p=0.82). In sum, although there may be a small benefit with warfarin in patients followed beyond 4 years, the risk of hemorrhage negates it. And while this study does not argue for a change in clinical practice, it does bolster support for recommending aspirin to our heart failure patients in the hope of preventing thromboembolic events.
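The "offset" is easy to see by subtracting the event rates directly; a quick sketch with the per-100-patient-year figures quoted above:

```python
# Event rates per 100 patient-years quoted from WARCEF above
stroke_rate = {"warfarin": 0.72, "aspirin": 1.36}
bleed_rate = {"warfarin": 1.78, "aspirin": 0.87}

strokes_prevented = stroke_rate["aspirin"] - stroke_rate["warfarin"]
bleeds_added = bleed_rate["warfarin"] - bleed_rate["aspirin"]
print(f"Ischemic strokes prevented per 100 patient-years: {strokes_prevented:.2f}")
print(f"Major hemorrhages added per 100 patient-years: {bleeds_added:.2f}")
```

Warfarin prevents roughly 0.64 strokes but adds roughly 0.91 major hemorrhages per 100 patient-years, which is why the stroke benefit does not translate into a net advantage on the primary outcome.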

From scanning the week in medical literature, an overarching theme emerges: with new data, medicine is becoming more and more personal. As physicians, we have a duty to our patients, and a challenge to ourselves, to think critically about people on an individual level. We can ask “Where do you fit in?” or “How can I serve you best?” and recognize that guidelines exist to provide just that: a handrail, not a paved path. We must think deliberately about each decision. Supporting evidence changes rapidly, yet we must still trust our instincts and continue to question each step while having the courage to advance.

Dr. Jessica Taff is a first-year resident at NYU Langone Medical Center

Peer reviewed by Barbara Porter, Section Editor, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Facebook Partners with Donate Life America to Dramatically Increase Number of Registered Organ Donors [press release]. 2012 May 1.

2. Van Ravesteyn NT, Miglioretti DL, Stout NK, et al. Tipping the Balance of Benefits and Harms to Favor Screening Mammography Starting at Age 40 Years: A Comparative Modeling Study of Risk. Ann Intern Med. 2012;156(9):609-617.

3. Nelson HD, Zakher B, Cantor A, et al. Risk Factors for Breast Cancer for Women Aged 40 to 49 Years: A Systematic Review and Meta-analysis. Ann Intern Med. 2012;156(9):635-648.

4.  Bahn RS, Burch HB, Cooper DS, et al; American Thyroid Association; American Association of Clinical Endocrinologists. Hyperthyroidism and other causes of thyrotoxicosis: management guidelines of the American Thyroid Association and American Association of Clinical Endocrinologists. Thyroid. 2011; 21(6): 593-646.

5. Razvi S, Weaver JU, Butler TJ, Pearce SHS. Levothyroxine Treatment of Subclinical Hypothyroidism, Fatal and Nonfatal Cardiovascular Events, and Mortality. Arch Intern Med. 2012 Apr (Epub ahead of print).

6.  Peretz A. Subclinical Hypothyroidism: To Screen or Not to Screen? Clinical Correlations. 2011 Aug.

7. Homma S, Thompson JLP, Pullicino P, et al. Warfarin and Aspirin in Patients with Heart Failure and Sinus Rhythm. N Engl J Med. 2012 May (Epub ahead of print).

8.  Cokkinos DV, Haralabopoulos GC, Kostis JB, Toutozas PK. Efficacy of antithrombotic therapy in chronic heart failure: the HELAS study. Eur J Heart Fail. 2006; 8: 428-32.

9.  Cleland JGF, Findlay I, Jafri S, et al. The Warfarin/Aspirin Study in Heart Failure (WASH): a randomized trial comparing antithrombotic strategies for patients with heart failure. Am Heart J. 2004; 148: 157-64.

10.  Massie BM, Collins JF, Ammon SE, et al. Randomized trial of warfarin, aspirin, and clopidogrel in patients with chronic heart failure: the Warfarin and Antiplatelet Therapy in Chronic Heart Failure (WATCH) trial. Circulation. 2009; 119: 1616-24.

Primecuts – This Week In the Journals

May 1, 2012

By Matthew Ingham, MD

Faculty Peer Reviewed

History provides many examples of medical interventions that were intended for one use, but were ultimately found therapeutic for a wholly different purpose. A review of this week’s prominent medical journals finds a number of studies proposing new applications for established treatments, including the use of bariatric surgery for inducing the remission of diabetes, bone marrow cells to improve heart failure, aspirin for prevention of cancer metastasis, and highly active antiretroviral therapy (HAART) for preexposure prophylaxis in those most at risk for HIV infection.

Until now, only one randomized trial had suggested the superiority of gastric banding over usual medical care for the treatment of type 2 diabetes, and that trial involved patients with relatively mild diabetes of shorter duration.[1] In the present STAMPEDE trial,[2] Schauer et al. of the Cleveland Clinic randomized 150 patients at that center to one of three treatment groups: intensive medical therapy alone, or medical therapy plus either Roux-en-Y gastric bypass or sleeve gastrectomy. Patients had a mean age of 48 years, a mean BMI of 36 (range, 27 to 43), and a mean diabetes duration of 8 years; 22% were using insulin, and 74% were white. All patients were treated with aggressive medical therapy targeting an A1c of less than 6%. The primary outcome was the proportion of patients with an A1c less than 6% at 12 months after randomization, achieved in 12% of the medical-therapy group, 42% of the gastric-bypass group, and 37% of the sleeve-gastrectomy group. The differences between medical therapy and either surgical option were highly statistically significant; the difference between the two surgical groups was not. Notably, all patients in the gastric-bypass group who achieved an A1c less than 6% did so without any medications or insulin, whereas 28% of those in the sleeve-gastrectomy group still required medical therapy.
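To translate those response proportions into clinic-friendly terms, the number needed to treat (NNT) for one additional patient to reach the A1c goal can be computed from the absolute risk difference; a quick sketch using the 12-month proportions above (my arithmetic, not an analysis reported in the trial):

```python
def nnt(p_treated, p_control):
    # Number needed to treat: reciprocal of the absolute difference
    # in the proportion of patients reaching the goal
    return 1 / (p_treated - p_control)

# 12-month proportions with A1c < 6%: medical 12%, bypass 42%, sleeve 37%
print(f"Gastric bypass vs. medical therapy: NNT = {nnt(0.42, 0.12):.1f}")
print(f"Sleeve gastrectomy vs. medical therapy: NNT = {nnt(0.37, 0.12):.1f}")
```

Roughly 3 to 4 patients would need surgery, on top of medical therapy, for one additional patient to reach an A1c below 6% at one year, a strikingly low NNT for any intervention.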

Across all patients, mean A1c and fasting glucose at 12 months were significantly lower in the surgical groups than in the medical group. Improvement in the surgical groups was seen by 3 months and sustained at one year. The average number of diabetes agents required for glycemic control increased in the medical-therapy group but decreased in both surgical groups, with 38% of the medical group still requiring insulin, compared with 4% in the bypass group and 8% in the sleeve group. Although there was no difference among groups in low-density lipoprotein (LDL) cholesterol or mean blood pressure, the surgical groups required fewer medications to achieve comparable results. Improvement in glycemic control preceded maximal weight loss, suggesting improved insulin resistance as the mechanism, a hypothesis supported by improvements in blood insulin levels in the surgical groups. There were no deaths in any group, but four surgically treated patients required additional operations. The study was limited by short follow-up, a single-center design at a tertiary academic center, and inadequate power and follow-up to show improvements in cardiovascular clinical outcomes.

Another New England Journal study, by Mingrone et al., compared conventional medical therapy with gastric bypass or biliopancreatic diversion for inducing remission of type 2 diabetes.[3] This randomized Italian study enrolled 60 patients with a BMI of 35 or more, at least a 5-year history of diabetes, and an A1c greater than 7%, randomizing them to standard medical therapy (goal A1c less than 7%), gastric bypass, or biliopancreatic diversion. The primary end point was remission of type 2 diabetes, defined as a fasting plasma glucose below 100 mg/dL or an A1c below 6.5% for 1 year without medical therapy. The mean age of participants was 42 years, mean A1c 8.6%, mean BMI 45, and mean duration of diabetes 6 years. At two years of follow-up there were no remissions in the medical-management group, whereas 75% of patients in the bypass group and 95% in the diversion group achieved remission, a statistically significant difference for each comparison. The average time to remission was 10 months in the bypass group and 5 months in the biliopancreatic-diversion group. Age, sex, baseline BMI, and A1c did not predict remission. Among all patients, a significantly greater improvement in mean A1c was seen in each surgical group than in the medical group. The lipid profile improved most with biliopancreatic diversion, except for HDL, which improved most after gastric bypass. At 2 years, total cholesterol had normalized in 27% of patients in the medical-therapy group, compared with all patients in the surgical groups. Blood pressure improved in all groups, but the surgical groups required less medication at the end of follow-up.
Of note, there was again no correlation between the degree of weight loss and the likelihood of remission among the surgically treated patients, suggesting that the ameliorating effects of bariatric surgery operate through mechanisms other than weight loss itself. Preoperative BMI also did not predict remission. Limitations again included short follow-up, small sample size, a single-center design, and inability to detect differences in clinical outcomes. There were no deaths in any group, but two surgical patients required reoperation.

Bone marrow cells (BMCs) have long been pursued as therapy for a range of diseases. An interesting article in this week’s Journal of the American Medical Association sought to further assess the utility of transendocardial delivery of bone marrow mononuclear cells for the treatment of chronic ischemic heart disease and left ventricular dysfunction.[4] A previous phase 1 study by the same group had shown the safety of this method but was insufficiently powered to assess efficacy. The present study, FOCUS-CCTRN, was a phase 2 randomized, double-blind, placebo-controlled trial enrolling patients with clinically stable coronary disease, an LVEF less than 45%, and NYHA class II-III heart failure, with perfusion defects on SPECT imaging but no anatomy suitable for revascularization. Ninety-two patients were randomized in a 2:1 ratio to transendocardial delivery of BMCs from 90 to 100 cc of iliac crest aspirate or to a placebo sham procedure. BMCs were delivered to left ventricular regions identified as viable by electromechanical mapping. The mean age was 63 years, mean LVEF was 31%, and 95% of patients were male. The primary outcomes, assessed at 6 months against baseline, were left ventricular end-systolic volume (LVESV), maximal oxygen consumption as measured by the Naughton treadmill protocol, and defect size on SPECT. The study showed no difference in any primary outcome, or in NYHA class, need for anginal medications, or BNP. However, at six months there were statistically significant but small improvements in LVEF and stroke volume in the BMC group compared with placebo (for LVEF, 1.4% vs. -1.3%; for stroke volume, 2.7 mL vs. 5.8 mL). Interestingly, the authors found that improvement in LVEF correlated with the proportion of CD34 and CD133 cells in the delivered sample; these cells are involved in cell survival and give rise to vascular and endothelial progenitor cells.
The study was limited by its sample size, especially given that power calculations were based on likely unrealistic estimates of improvement (27 mL for LVESV, 10 percentage points for SPECT reversibility). Based on the data the authors provided, there were no adverse events clearly associated with BMC therapy.

The benefits of aspirin in the primary and secondary prevention of cardiovascular disease have been well studied, and the rich data from these trials now make it possible to study aspirin’s effect on other disease processes. In the current issue of the Lancet, Rothwell et al. conducted a meta-analysis of the effect of aspirin on the risk of cancer metastasis.[5] The authors had noted in other recent studies that aspirin’s effect on cancer mortality was greater than its effect on cancer incidence, and that the mortality reduction occurred too quickly to be explained by primary prevention alone. They therefore hypothesized that aspirin also slows the metastatic progression of cancers already occult at the time of study enrollment. Rothwell and colleagues analyzed the 5 major randomized trials of aspirin versus placebo for the primary prevention of vascular events in the United Kingdom. Through detailed chart review and communication with the primary authors, data were extracted for all cases of incident solid cancer during the trials, including date of diagnosis, primary site, staging and treatment, presence of metastasis at diagnosis and during follow-up, and mortality due to cancer or other causes. Hematologic cancers and primary brain cancers were excluded. Across the five trials there were 17,285 participants, 1,101 incident cancers, and 563 cancer deaths. Definitive determination of metastatic status could not be made in 20% of cases. Aspirin reduced the incidence of any new cancer during the trials (OR 0.88, p=0.04) but had an even greater effect on the risk of death due to incident cancer (OR 0.77, p=0.002).
In support of the authors’ hypothesis, aspirin reduced the risk of cancer with definite metastatic disease at diagnosis (HR 0.64, p=0.001) and correspondingly increased the proportion of cancers with local disease only. No particular metastatic site was preferentially affected. More detailed analysis showed that aspirin’s protective effect against metastatic disease at diagnosis was significant for colon cancer, with trends toward significance in other adenocarcinomas but no significant effect or trend in non-adenocarcinomas. In patients with adenocarcinoma without metastatic disease at diagnosis, aspirin reduced the risk of later metastasis (HR 0.45, p=0.0009), with the effect most pronounced in colon cancer. Overall survival was higher in the aspirin groups than in controls, for both death from any cause and death due to cancer. The authors hypothesize that aspirin’s protective effect is mediated through platelets, which are thought to facilitate hematogenous cancer metastasis. Limitations include the 20% of cancers for which metastatic status was uncertain. One may also wonder whether patients on aspirin were more likely to have cancer detected earlier; for example, might aspirin have predisposed to bleeding events, and thus to diagnostic imaging that uncovered cancers sooner?

HAART has extended the lives of millions of patients living with HIV/AIDS. The recent iPrEx study suggests that prophylactic preexposure HAART with daily tenofovir-emtricitabine (PrEP) may reduce the risk of HIV transmission in adherent high-risk MSM populations by up to 73%.[6] Many questions about PrEP remain, however, including its cost-effectiveness and its potential to encourage high-risk behavior among MSM. A recent study in Annals by Juusola and colleagues uses a complex mathematical model to address the cost-effectiveness of PrEP.[7] The authors adapted a deterministic dynamic compartmental model of HIV transmission and progression. Although the mathematical details are beyond our scope, model parameters included HIV prevalence; assumptions about sexual behavior and different risk strata within the MSM population; condom use; screening; infectivity with and without treatment; the effectiveness of PrEP; and the costs of testing, counseling, HIV diagnosis, ART, and PrEP, among others. The authors concluded that initiating PrEP in 20% of the undifferentiated-risk MSM population would prevent about 60,000 infections over 20 years of follow-up, at a cost of $172,091 per quality-adjusted life-year (QALY) gained, which was not felt to be cost-effective. However, initiating PrEP in the highest-risk 20% of MSM would cost about $50,000 per QALY, which compares favorably with other prophylactic interventions. Even so, the total cost would still be nearly $4 billion per year, which is high, particularly in view of competing interventions for HIV/AIDS prevention and treatment.
As the authors note, the effectiveness of PrEP for high-risk MSM will depend on clinicians’ ability to reach those most at risk, and on this population’s adherence to PrEP, which may prove difficult: this group tends to have limited contact with the medical system, a high incidence of substance abuse, and the greatest exposure to stigma. The authors also note that if PrEP proves effective when taken around the time of exposure only (rather than daily), or if the cost of HAART declines, cost-effectiveness will improve further.
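The dollars-per-QALY figures above are incremental cost-effectiveness ratios (ICERs): the extra cost of a strategy divided by the extra QALYs it gains over the comparator. A minimal sketch with hypothetical round numbers chosen only to illustrate the calculation (not figures from the Juusola model):

```python
def icer(incremental_cost, incremental_qalys):
    # Incremental cost-effectiveness ratio: extra dollars spent
    # per extra quality-adjusted life-year gained vs. the comparator
    return incremental_cost / incremental_qalys

# Hypothetical: a strategy costing $500 million more that gains 10,000 QALYs
print(f"${icer(500e6, 10_000):,.0f} per QALY gained")
```

A ratio around $50,000 per QALY is the informal benchmark against which the targeted-PrEP strategy compares favorably, whereas the $172,091-per-QALY figure for untargeted PrEP falls well above it.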

Many physiologic processes have effects on the pathology of multiple organ systems. The case of aspirin provides useful insight into this observation, where platelets clearly play a role in thrombosis and cardiovascular events, but may also play a role in cancer metastasis. In all the cases described above, the promise of known therapies for unknown uses requires further insight into the mechanisms of action of the interventions themselves.

Dr. Matthew Ingham is a second-year resident at NYU Langone Medical Center

Peer reviewed by Danise Schiliro, MD,  contributing editor, Clinical Correlations

Image courtesy of Wikimedia Commons


1. Dixon JB, et al. Adjustable gastric banding and conventional therapy for type 2 diabetes: a randomized controlled trial. JAMA. 2008;299:316-323.

2. Schauer PR, et al. Bariatric Surgery versus Intensive Medical Therapy in Obese Patients with Diabetes. N Engl J Med. 2012;366(17):1567-1576.

3. Mingrone G, et al. Bariatric Surgery versus Conventional Medical Therapy for Type 2 Diabetes. N Engl J Med. 2012;366(17):1577-1585.

4. Perin EC, et al. Effect of Transendocardial Delivery of Autologous Bone Marrow Mononuclear Cells on Functional Capacity, Left Ventricular Function, and Perfusion in Chronic Heart Failure. JAMA. 2012;307(16):1717-1726.

5. Rothwell PM, et al. Effect of daily aspirin on risk of cancer metastasis: a study of incident cancers during randomized controlled trials. Lancet. 2012;379:1591-1601.

6. Grant RM, et al. Preexposure chemoprophylaxis for HIV prevention in men who have sex with men. N Engl J Med. 2010;363:2587-2599.

7. Juusola J, et al. The cost-effectiveness of preexposure prophylaxis for HIV prevention in the United States in men who have sex with men. Ann Intern Med. 2012;156:541-550.

Primecuts – This Week In The Journals

April 23, 2012

When April with

Her showers sweet

Drives you inside to

Your window seat,


We shall excite

Synapses neural

With landmark trials

And doggerel.


This cruelest month

Should not deter

A nature curious

With thoughts astir.


So electrify

Your occiputs

With this week’s release

Of fine Primecuts.

Happy National Poetry Month!

By Todd Cutler, MD

Faculty Peer Reviewed

In 2006, the US Food and Drug Administration (FDA) approved the use of Avastin (bevacizumab), a monoclonal antibody against vascular endothelial growth factor A (VEGF-A), in combination with the standard therapy of paclitaxel and carboplatin for the treatment of non-small-cell lung cancer (NSCLC). This decision was based on the results of a trial [1] showing a statistically significant improvement in median survival (12.3 versus 10.3 months) among patients who received bevacizumab plus standard therapy compared with patients assigned to standard therapy alone (p=0.003). This was despite a significantly greater number of deaths directly attributable to adverse effects in the treatment group, including fatal pulmonary hemorrhage. Furthermore, a survival benefit was not seen in women, in patients with more advanced disease, or in patients over the age of 65, and a subsequent meta-analysis [2] failed to show an improvement in mortality at one year.

In a study published this week in JAMA [3], investigators further examined whether Avastin has any benefit in patients over the age of 65 with NSCLC. They performed a retrospective analysis of a national cancer database, with a primary outcome of all-cause mortality among patients who received Avastin plus standard therapy versus standard therapy alone. Interestingly, the authors compared the Avastin-treated patients with two separate control groups, diagnosed with NSCLC before and after FDA approval of Avastin, respectively. The cohort was divided this way out of concern that if “physicians chose bevacizumab-carboplatin-paclitaxel for their healthier patients, then a selection bias might result in better survival for those patients.”

The Avastin treatment group included 318 patients, while the post- and pre-FDA-approval standard treatment groups had 1,182 and 2,664 patients, respectively. Median overall survival was 9.7 months with Avastin, compared with 8.9 months in the post-approval control group and 8.0 months in the earlier control group. One-year survival probabilities were 39.6%, 40.1%, and 35.6%, respectively. There was no significant difference in survival between patients treated with Avastin and either the post-FDA-approval group (HR, 1.01; 95% CI, 0.88-1.15) or the pre-FDA-approval group (HR, 0.94; 95% CI, 0.83-1.06).

The authors concluded that, “our analyses suggest that the addition of bevacizumab to carboplatin and paclitaxel is not associated with demonstrable improvement in overall survival in the Medicare population.” They also commented that the finding of lower-than-expected prescription rates, especially given the powerful financial incentives to prescribe this expensive drug, “provides some measure of reassurance that oncologists are circumspect and judicious in their use of new agents with uncertain benefit in the Medicare population.” In a statement [4], the lead investigator of this study, Deborah Schrag, MD, MPH, noted, “Adoption of bevacizumab was by no means universal. We didn’t find that all physicians rushed out to administer this drug.”

These results come on the heels of the FDA’s decision to revoke its approval of Avastin [5] for the treatment of breast cancer, noting “that women who take Avastin for metastatic breast cancer risk potentially life-threatening side effects without proof that the use of Avastin will provide a benefit, in terms of delay in tumor growth, that would justify those risks.” While the benefit of Avastin in certain cancers requires further elucidation, what has become evident over the past few years are the numerous adverse effects of Avastin [6], which include severe hypertension, fatal hemorrhage, vascular thrombosis, and bowel perforation.

These well-documented adverse effects, combined with the unclear benefit of Avastin, make it no surprise that enthusiasm for the drug has been muted since FDA approval. While the ultimate role of Avastin in the treatment of non-small-cell lung cancer has yet to be determined, these latest findings underscore the dearth of therapeutic options available to these patients and should remind physicians of the importance of palliative care [7] in their management.

Current classifications of breast cancer depend predominantly on the presence or absence of particular cell surface markers, specifically estrogen, progesterone and HER2 receptors. While this practice has successfully led to tailored therapies based on molecules that antagonize those receptors, the clinical response to therapy remains variable. Exciting research published this week in Nature [8] suggests that these approaches for classifying and treating breast cancer are likely to change in the future.

In what can truly be called a “bench to bedside” study, 997 samples of breast cancer tissue were subjected to a robust analysis of DNA mutations and their respective associations with gene transcription. This evaluation resulted in the identification of ten distinct molecular subgroups, which were subsequently validated in a separate batch of 995 tissue samples. Each subgroup had a characteristic rate of clinical progression, suggesting a more complex pattern of growth than what is currently understood at the molecular level. Furthermore, the investigators identified novel mutations in molecules involved in oncogenic signaling pathways that may offer targets for future therapies. Addressing the clinical implications [9] of these findings, lead investigator Carlos Caldas remarked, “We have a completely new way of looking at breast cancer…We need to carry out more research in the laboratory and in patients to confirm the most effective treatment plan for each of the 10 types of breast cancer.”

Lastly, it was announced this week [10] that Warren Buffett, the multi-billionaire investor and philanthropist, has been diagnosed with stage 1 prostate cancer following screening with prostate-specific antigen (PSA) and that he will soon be undergoing radiotherapy. Following this announcement, shares in Berkshire Hathaway fell slightly [11], even though the 81-year-old chairman and CEO of the holding company is unlikely to have his life shortened by his disease. While questions are being raised regarding his ultimate successor at the company, what many want to know is why, at his age, Mr. Buffett was screened for prostate cancer in the first place. A multitude of writers and bloggers at major media outlets are asking this exact question. Tara Parker-Pope [12] of the New York Times adeptly synthesizes the major points of the debate and correctly notes that Mr. Buffett is much more likely to suffer the adverse effects of radiation therapy than to incur any survival benefit from his treatment. The degree to which this news has sparked interest in the general media is a testament to how pervasive the topic of PSA testing in prostate cancer has become, and one cannot help but get the feeling that well-informed participants have elevated the national debate over the past few years.

Dr. Todd Cutler is an associate editor, Clinical Correlations

Peer reviewed by Robert J. Gianotti, MD, NYU Chief Medical Resident, NYU Langone Medical Center

Image courtesy of Wikimedia Commons


[1] Sandler et al. Paclitaxel-carboplatin alone or with bevacizumab for non-small-cell lung cancer. N Engl J Med. 2006;355(24):2542-2550.

[2] Yang et al. Effectiveness and safety of bevacizumab for unresectable non-small-cell lung cancer: a meta-analysis. Clin Drug Investig. 2010;30(4):229-241.

[3] Zhu et al. Carboplatin and paclitaxel with vs without bevacizumab in older patients with advanced non-small cell lung cancer. JAMA. 2012;307(15):1593-1601.

[4] Phend C. Avastin No Help in Advanced NSCLC [Internet]. MedPage Today. April 18, 2012.

[5] Hamburg MA. FDA Commissioner announces Avastin decision [Internet]. US Food and Drug Administration. November 18, 2011.

[6] Dienstmann et al. Benefit-risk assessment of bevacizumab in the treatment of breast cancer. Drug Saf. 2012;35(1):15-25.

[7] Temel et al. Early palliative care for patients with metastatic non-small-cell lung cancer. N Engl J Med. 2010;363(8):733-742.

[8] Curtis et al. The genomic and transcriptomic architecture of 2,000 breast tumours reveals novel subgroups. Nature. 2012:1-7.

[9] Batty D. Breast Cancer Treatment Gets Boost [Internet]. The Guardian. April 18, 2012.

[10] Berkowitz B. Warren Buffett has prostate cancer, sees no danger [Internet]. Reuters. April 18, 2012.

[11] Berkowitz B. Berkshire shares dip on Buffett cancer news [Internet]. Reuters. April 18, 2012.

[12] Parker-Pope T. Why Was Warren Buffett Screened for Prostate Cancer? [Internet]. The New York Times. April 18, 2012.