What do the polio vaccine, in vitro fertilization, and gene mapping have in common? Their successes are all due, in varying degrees, to HeLa cells. HeLa cells were derived more than 60 years ago, without her knowledge, from Henrietta Lacks, a woman battling an aggressive form of cervical cancer. Researchers soon realized the potential of this self-replicating cell line, which has been referenced in over 74,000 studies to date. Concern among Ms. Lacks’ descendants erupted in the 1970s, when they were first alerted to the use of their ancestor’s cells for scientific research; since then, however, the family has largely been ignored. The controversy was brought into the limelight again 5 months ago when the genome sequence for HeLa cells was published online. Ms. Lacks’ descendants were not informed of the project and worried that information in the genome, such as the risk of Alzheimer’s disease or cancer, might one day be used to discriminate against them. Working with the National Institutes of Health (NIH), the Lacks family agreed to have the genome sequence stored in an NIH database, where researchers can apply for access to the sequence. Each request will be reviewed by a committee that includes two members of the Lacks family. The agreement does not include any financial benefit from commercial products that may be developed from the research. The Immortal Life of Henrietta Lacks by Rebecca Skloot chronicles this unprecedented journey.
Meanwhile, a New England Journal of Medicine article this week examined the effect of various glucose levels on the risk of developing dementia. In a prospective cohort study involving volunteers from the Adult Changes in Thought (ACT) study group, 2067 dementia-free, community-dwelling individuals over the age of 65 in Washington State were assessed every two years with the Cognitive Abilities Screening Instrument. Participants’ measurements of fasting glucose, random glucose, and glycosylated hemoglobin were combined using a hierarchical Bayesian framework to derive a time-varying estimate of the average glucose level for each participant at baseline and at subsequent five-year intervals. The participants were stratified into two groups based on whether or not they had diabetes (defined as filling at least two diabetes-related prescriptions per year). Over a median follow-up of 6.8 years, dementia developed in 524 of 2067 (25.4%) total participants: 450 of 1724 (26.1%) participants without diabetes and 74 of 343 (21.6%) participants with diabetes. The hazard ratio (HR) for dementia increased with increasing average glucose level in both groups. For an average glucose level of 115 mg/dL in a patient without diabetes, the HR was 1.18 (95% CI 1.04-1.33, p=0.01). An average glucose level of 190 mg/dL in a patient with diabetes correlated with an HR of 1.40 (95% CI 1.12-1.76, p=0.002). These estimates were adjusted for confounding variables including age, sex, race, education, apoE ε4 allele, atrial fibrillation, hypertension, coronary artery disease, cerebrovascular disease, congestive heart failure, exercise, and smoking status. Baseline characteristics showed a higher prevalence of confounders in the diabetic group, although p values were not reported.
The limitations of this study included its prospective cohort design, the possibility of unknown confounders, and the definition of diabetes used (filling two prescriptions per year for diabetes-related medications). It is possible that individuals labeled as non-diabetics with higher glucose levels may have had undiagnosed diabetes. However, the authors contend that they found an increased risk of dementia associated with rising glucose levels even at the lower end of the glucose spectrum. This study highlights the importance of maintaining glycemic control in our patients with diabetes. However, the observation that higher glucose levels increase dementia risk in individuals without diabetes underscores the need for further research into the mechanism by which this occurs, including the possibility that an unknown confounder is affecting the results.
This week in JAMA, a randomized comparative effectiveness trial randomized 5,994 uninsured men and women aged 54 to 64 to three different approaches to colorectal cancer screening. Fecal immunochemical test (FIT) outreach involved mailing instructions and a free FIT card to the patient; colonoscopy outreach involved mailing an invitation to schedule a no-cost colonoscopy; and usual care involved routine screening at primary care visits. Both the FIT and colonoscopy groups received up to two “live” telephone reminders within three weeks of the mailing and were allowed to continue usual office-based screening with their doctors. Screening participation rates were 40.7% for FIT (95% CI 38.3-43.1), 24.6% for colonoscopy (95% CI 20.8-28.5), and 12.1% for usual care (95% CI 11.1-13.1), all with P<0.001. FIT and colonoscopy outreach were both superior to usual care, with a number needed to invite to accomplish one additional screening over usual care of 8 for colonoscopy and 3.5 for FIT. FIT outreach was superior to colonoscopy outreach with regard to the number of participants completing screening. However, 11 of 60 patients (18%) with abnormal FIT results did not subsequently undergo diagnostic colonoscopy. A limitation of the study was that it was conducted over only one year and therefore cannot evaluate whether participants would continue to complete yearly FIT testing. Additionally, with continued yearly reminders, more patients might complete colonoscopy, making it preferable to FIT given that colonoscopy is more sensitive at detecting colorectal cancer. Given the results of this study, we may begin to see public health efforts shift toward screening for colorectal cancer using mail outreach strategies and FIT technology. A cost-benefit analysis is required to determine whether these new methods are feasible on a larger scale.
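The number-needed-to-invite figures follow directly from the reported participation rates: the reciprocal of the absolute difference between each outreach arm and usual care. A minimal sketch of that arithmetic (the function name is illustrative, not from the study):

```python
def number_needed_to_invite(intervention_rate, usual_care_rate):
    """Invitations needed for one additional completed screening
    relative to usual care (reciprocal of the absolute difference)."""
    return 1 / (intervention_rate - usual_care_rate)

# Participation rates reported in the trial
fit_rate = 0.407
colonoscopy_rate = 0.246
usual_care_rate = 0.121

print(round(number_needed_to_invite(fit_rate, usual_care_rate), 1))          # 3.5
print(round(number_needed_to_invite(colonoscopy_rate, usual_care_rate), 1))  # 8.0
```

The same reciprocal-of-absolute-difference calculation underlies the familiar number needed to treat.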
In Science this week, researchers reported an exciting development in the global health initiative to eliminate malarial infection worldwide. Previous attempts at developing a vaccine have been unsuccessful: in a phase 3 efficacy trial published in 2012, the subunit vaccine RTS,S/AS01 prevented the development of malaria in only 31.3% of African infants over a 12-month period. As malaria infection is responsible for an estimated 0.66 to 1.24 million deaths worldwide, researchers press onward for a solution. Now, in the results of a phase 1 clinical trial, intravenous administration of five doses of irradiated whole sporozoites in the pre-erythrocytic stage resulted in immunity to subsequent challenge with malaria infection (P=0.015 compared with controls). Control participants were individuals who did not receive the vaccine and were subsequently exposed to malaria infection. Adverse events included transient asymptomatic elevations in alanine aminotransferase and/or aspartate aminotransferase in 40% of vaccine recipients; these were not dose-dependent. The researchers postulate that the vaccine was successful because it was administered intravenously, which was more effective at eliciting a T-cell response, particularly in the liver. Future studies are needed to examine the duration of protection, the optimal number and spacing of doses, and how to immunologically monitor response to the vaccine. Once perfected, this vaccine would have a tremendous impact on global health, saving millions of lives.
In the ongoing struggle to manage outpatient hypertension, both the Joint National Committee on the Prevention, Detection, Evaluation, and Treatment of High Blood Pressure (JNC 7) and the American Heart Association recommend home blood pressure (BP) monitoring in addition to clinic visits for blood pressure evaluation, despite a lack of supporting evidence [6, 7]. In the Annals this week, researchers conducted a systematic review and meta-analysis of 52 studies and found moderate-strength evidence that self-monitored BP (SMBP) alone, versus usual care (clinic blood pressure monitoring), lowered blood pressure at 6 months, with a summary net difference of -3.9 mm Hg systolic and -2.4 mm Hg diastolic. SMBP with additional support (defined as counseling, education, web-based, and miscellaneous interventions) versus usual care had high-strength evidence of decreased blood pressure at 12 months, with systolic BP lowered by 3.4 to 8.9 mm Hg and diastolic BP lowered by 1.9 to 4.4 mm Hg. However, whether these effects last beyond 12 months is uncertain. Limitations include a lack of minority representation and the exclusion of patients other than those with uncomplicated hypertension. Although the blood pressure reductions are modest, at a population level this degree of reduction has been estimated to reduce mortality by 6% to 14% from stroke, 4% to 9% from coronary heart disease, and 3% to 7% from all causes, making it a worthwhile consideration for individuals motivated to monitor their blood pressure at home [6, 9].
Additional Recommended Articles:
1. Goldfarb DS. A piece of my mind. Cocktail party nephrology. JAMA. 2013 Jun 26;309(24):2561-2. Our own Dr. David Goldfarb reflects in his inimitable way on his personal experiences with nephrolithiasis and how it shaped his practice, including his current role as the director of a kidney stone clinic.
2. Lebwohl, B., et al., Mucosal healing and risk for lymphoproliferative malignancy in celiac disease: a population-based cohort study. Ann Intern Med, 2013. 159(3): p. 169-75. In a population-based cohort study, patients with celiac disease and persistent villous atrophy, compared with those showing mucosal healing on pathologic examination of intestinal biopsies, were at increased risk for lymphoproliferative malignancy: standardized incidence ratio (SIR) 3.78 (95% CI 2.71 to 5.12) with persistent villous atrophy versus SIR 1.50 (95% CI 0.77 to 2.62) with observed mucosal healing.
3. Vander Lugt, M.T., et al., ST2 as a marker for risk of therapy-resistant graft-versus-host disease and death. N Engl J Med, 2013. 369(6): p. 529-39. Plasma levels of biomarker ST2 (suppression of tumorigenicity 2) in patients with graft versus host disease (GVHD) correlated with risk of developing treatment resistant disease or death after transplantation. Measurement of ST2 prior to treatment for GVHD and six months later could identify these individuals, allowing for closer monitoring and more aggressive therapy.
4. Li, C.I., et al., Use of Antihypertensive Medications and Breast Cancer Risk Among Women Aged 55 to 74 Years. JAMA Intern Med, 2013. In a population-based case-control study of women aged 55 to 74 years, exposure to any type of calcium channel blocker for more than ten years was associated with an increased risk of ductal breast cancer (OR 2.4, 95% CI 1.2-4.9, P=.04) and lobular breast cancer (OR 2.6, 95% CI 1.3-5.3, P=.01). No increased risk was seen with other types of antihypertensives, including diuretics, beta-blockers, and angiotensin II inhibitors. Further research, including prospective studies, is required to evaluate this finding before practice-changing recommendations can be made.
Theresa Sumberac, MD, is a third-year internal medicine resident at NYU Langone Medical Center
Peer Reviewed by Matthew Vorsanger, MD, Associate Editor, Clinical Correlations
3. Gupta, S., et al., Comparative Effectiveness of Fecal Immunochemical Test Outreach, Colonoscopy Outreach, and Usual Care for Boosting Colorectal Cancer Screening Among the Underserved: A Randomized Clinical Trial. JAMA Intern Med, 2013.
6. Chobanian, A.V., et al., The Seventh Report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure: the JNC 7 report. JAMA, 2003. 289(19): p. 2560-7.
7. Pickering, T.G., et al., Call to action on use and reimbursement for home blood pressure monitoring: Executive Summary. A joint scientific statement from the American Heart Association, American Society of Hypertension, and Preventive Cardiovascular Nurses Association. J Clin Hypertens (Greenwich), 2008. 10(6): p. 467-76.