Announcements

  • May 12, 2009

    Two recent imaging studies shed light on how menopausal hormone therapy (MHT) may affect the brains of older women. Previous studies had shown that MHT increased the likelihood that older women would have difficulty with thinking and memory skills and would develop dementia or cognitive impairment.

    In the first study, researchers with the Women’s Health Initiative Memory Study-MRI (WHIMS-MRI), an ancillary study of the Women’s Health Initiative (WHI) hormone therapy clinical trials, took MRI brain scans of approximately 1,400 women, ages 71 to 89, 1 to 4 years after the WHI hormone trials ended. They found that women who had taken MHT had smaller brain volumes in two brain areas than women who had taken a placebo. Brain volume was lower in the frontal lobe and in the hippocampus, areas involved in thinking and memory skills. Loss of volume in the hippocampus is a risk factor for dementia.

    “Our findings suggest one possible explanation for the increased risk for dementia in older women who had previously taken MHT in the WHIMS,” said lead author Dr. Susan Resnick of the NIA’s Intramural Research Program. “The findings also suggest that hormone therapy in older postmenopausal women has a negative effect on brain structures important in maintaining normal memory functioning. However, this negative effect was most pronounced in women who already may have had some memory problems before using MHT, suggesting that the therapy may have accelerated a neurodegenerative disease process that had already begun.”

    Dr. Resnick emphasized that the women in this study were randomly assigned to MHT later in life than the usual period of treatment around the time of the menopausal transition. It remains unclear whether earlier MHT given only during the period of most intense menopausal symptoms is associated with poorer cognition.

    In the second study, researchers analyzing the same MRI scans found that MHT was not linked to an increase in the volume of small vascular lesions in the brain, which are often the first sign of cerebrovascular disease. Lead author Dr. Laura Coker of Wake Forest University noted that the negative effects of MHT on cognitive skills may be related primarily not to vascular disease but to neurodegeneration, a conclusion supported by the first study’s findings of brain atrophy.

    References:

    Resnick, S.M., et al. Postmenopausal hormone therapy and regional brain volumes: the WHIMS-MRI Study. Neurology. 2009 Jan 13. 72(2):135–42.

    Coker, L.H., et al. Postmenopausal hormone therapy and subclinical cerebrovascular disease: the WHIMS-MRI Study. Neurology. 2009 Jan 13. 72(2):125–34.

  • March 15, 2010

    Specific personality characteristics may be important to successful aging, according to researchers who studied a group of adult children of centenarians. Other studies have shown that these personality traits promote good health and counter the effects of damaging traits.

    The children of people who lived to 100 years or more are generally a model of healthy aging, with lower mortality and a lower prevalence of chronic diseases than other members of their birth cohort. In this study, researchers led by Dr. Thomas T. Perls of the Boston University Medical Center found that such offspring are also more extraverted and less neurotic than their peers.

    Using the NEO Five-Factor Inventory, a 60-item, self-report questionnaire, the researchers measured five personality characteristics—neuroticism, extraversion, openness, agreeableness, and conscientiousness—in 246 offspring of centenarians with a mean age of 75 years. Both men and women scored in the low range for neuroticism and in the high range for extraversion.

    The researchers note that low neuroticism and high extraversion levels may confer health benefits. For example, people who are lower in neuroticism may be able to manage stressful situations more effectively than those with higher neuroticism levels. Similarly, high extraversion levels have been associated with greater subjective well-being, vitality, and longevity.

    Reference:

    Givens, J.L., et al. Personality traits of centenarians’ offspring. J Am Geriatr Soc. 2009. 57:683–85.

  • March 15, 2010

    Older Americans have better cognitive health but worse overall health than their counterparts in England, two recent studies show. According to researchers, these health gaps could be due to differences in the prevalence of cardiovascular disease, depression, and other health factors, as well as differences in education, wealth, and access to health care.

    Older adults in the United States performed significantly better than their English counterparts on standard tests of cognitive function, according to a study published in BMC Geriatrics. The study—the first comparison of cognitive function in representative samples of older adults in the United States and England—looked at data on 8,299 Americans and 5,276 Britons, drawn from the 2002 waves of the NIA-funded Health and Retirement Study and the English Longitudinal Study of Ageing. All participants were non-Hispanic whites ages 65 and older.

    Overall, the difference in cognitive performance on tests of word recall and orientation was striking. On average, an 85-year-old American performed as well as a 75-year-old Briton. The U.S. advantage was greatest for people ages 85 and older, the age group on both sides of the Atlantic with the lowest scores on a 24-point cognitive scale.

    The difference in cognitive health could be due to several factors, the researchers write. Older adults in the United States are generally wealthier and better educated than those in England. They also reported lower levels of depressive symptoms. Higher levels of wealth and education and lower levels of depression have been associated with reduced risk of cognitive decline.

    In addition, the Americans had better cognitive health despite a higher prevalence of cardiovascular risk factors and disease—traits associated in some studies with poorer cognition. This result may be explained by the fact that the Americans were more likely than the British to take antihypertensive medications, which previous studies have suggested may help prevent cognitive decline.

    Another study, supported in part by the NIA and published in the American Journal of Public Health, found that older Americans had worse health than English and European seniors at all income levels, even though U.S. per-capita medical spending is two to three times higher than in Europe. Researchers compared the overall health status of non-Hispanic white adults ages 50 to 74 in the United States, England, and 10 European countries.

    Drawing on 2004 data from the Health and Retirement Study, the English Longitudinal Study of Ageing, and the Survey of Health, Ageing, and Retirement in Europe, the researchers found that Americans had the highest prevalence of chronic conditions and physical-function limitations. For example, 18 percent of Americans had heart disease, compared with 12 percent of Britons and 11 percent of Europeans. Poor Americans experienced the greatest health disadvantages compared with their overseas peers, but even well-off Americans reported health comparable to that of poorer Europeans.

    No single factor accounted for these disparities, the researchers note. Differences in behavioral risk factors such as smoking, obesity, physical activity, and alcohol consumption played a role, as did the prevalence of chronic disease, survival rates, and different health care systems. For instance, “the American medical system might be more focused on ameliorating the consequences of disease, with relatively less attention given to prevention,” the authors write.

    Together, the two studies point to the need for further international research to compare and explain the demographic, social, and health factors that account for differences in health status among older populations in different countries. Such research might help identify factors that could improve the cognitive and physical health of growing elderly populations worldwide.

    References:

    Langa, K.M., et al. Cognitive health among older adults in the United States and England. BMC Geriatr. 2009 June 25. 9:23.

    Avendano, M., et al. Health disadvantage of U.S. adults aged 50 to 74 years: a comparison of the health of rich and poor Americans with that of Europeans. Am J Public Health. 2009. 99(3):540–8.

  • March 15, 2010

    A recent study of long-lived naked mole rats calls into question the conventional theory that aging results from the accumulation of oxidative damage. Naked mole rats can live into their late 20s despite levels of oxidative damage that arise at a young age and remain steady throughout life. Another biological mechanism—resilient proteins that withstand oxidative stress—may be the key to their successful aging, according to NIA-funded researchers at the University of Texas Health Science Center at San Antonio.

    The research reveals another exception to the widely accepted theory that oxidative damage to proteins and other molecules leads to aging. In naked mole rats, proteins sustain oxidative damage early on yet remain stable throughout the rats’ long lives, the scientists found. In contrast, proteins in short-lived mice show increasing levels of oxidative damage as they get older.

    In comprehensive testing and analysis of the oxidation states of protein cysteines (sulfur-containing amino acids), the researchers found that, compared with mice, mole rats had higher levels of total cysteine and showed no age-related changes in cysteine oxidation over more than 20 years. The mole rats also showed unusual resistance to protein unfolding and lower levels of protein degradation during aging.

    The results suggest a new biochemical mechanism underlying longevity: the ability of oxidized proteins to maintain their structural stability and integrity.

    Reference:

    Pérez, V.I., et al. Protein stability and resistance to oxidative stress are determinants of longevity in the longest-living rodent, the naked mole-rat. Proc Natl Acad Sci USA. 2009. 106(9):3059–64.

  • March 15, 2010

    Older adults with limited participation in social activities had a faster decline in motor function than those who had frequent social engagements, according to a report in the Archives of Internal Medicine. The study suggests that more frequent participation in social activities may slow older adults’ motor decline, which can lead to disability and other adverse health outcomes.

    In an NIA-supported study of 906 older adults without stroke, Parkinson’s disease, or dementia, researchers from Rush University Medical Center in Chicago found that each 1-point decrease in a participant’s social activity score was associated with about a 33-percent more rapid rate of decline in motor function, a more than 40-percent increased risk of death, and a 65-percent increased risk of new disability. A 1-point decrease in social activity was equivalent to being about 5 years older at the start of the study. These associations held up after controlling for demographic characteristics and potential confounders such as chronic medical conditions, depression, and joint pain.

    The researchers measured social activity based on frequency of activities such as going to restaurants and sporting events, attending religious services, traveling, playing bingo, and doing volunteer work. Motor function was measured by a composite score on 18 tests, including walking speed, grip strength, hip flexion, and turning.

    While higher levels of physical activity are known to be associated with a slower rate of decline in motor function, this study and others suggest a similar effect for social activity. “These findings may be particularly relevant for intervention strategies designed for older adults, for whom participation in physical activities may be constrained because of underlying health problems,” the authors conclude. However, more research is needed to confirm that increased social activity causes slower motor decline, not vice versa.

    Reference:

    Buchman, A.S., et al. Association between late-life social activity and motor decline in older adults. Arch Intern Med. 2009 June 22. 169(12):1139–46.

  • March 15, 2010

    The appropriate age at which to end prostate cancer screening is controversial, with different organizations issuing different recommendations. Data from a new study suggest that screening might be safely discontinued in men ages 75 and older who have prostate-specific antigen (PSA) levels of less than 3 ng/ml—a cutoff point lower than that proposed in previous studies.

    PSA testing is common in older men despite evidence that those without aggressive prostate cancer are unlikely to benefit from diagnosis and treatment, write researchers from the Johns Hopkins School of Medicine and the NIA’s Intramural Research Program in Baltimore. To help weigh the risks and benefits of PSA testing in this population, they studied 849 men ages 40 and older who participated in the NIA Baltimore Longitudinal Study of Aging. Of the group, 122 had prostate cancer and 727 did not.

    Researchers determined the probability of high-risk prostate cancer developing in 5-year age groups starting at age 60, sorting the data by PSA cutoffs that ranged from less than 1 ng/ml to 3.5 ng/ml or greater. There is no PSA value below which a diagnosis of prostate cancer can be excluded, the authors note.

    Participants of all ages with a PSA of 3 ng/ml or more had an increased probability of developing high-risk prostate cancer or dying of the disease, the study found. Among those older than age 75, none with a PSA of less than 3 ng/ml died of prostate cancer, and only one developed high-risk cancer. Of the older men with a PSA of 3 ng/ml or greater, 10 died of prostate cancer and 18 had high-risk disease.

    Men 75 to 80 years old with a PSA less than 3 ng/ml are unlikely to develop or die of aggressive prostate cancer during their lifetimes, the researchers conclude. This finding suggests that PSA testing might be safely discontinued for these men, avoiding unnecessary treatment.

    Reference:

    Schaeffer, E.M., et al. Prostate specific antigen testing among the elderly—when to stop? J Urol. 2009 April. 181:1606–14.

  • March 15, 2010

    A high degree of conscientiousness—the tendency to follow societal norms, plan ahead, and be task- and goal-directed—has been shown to predict better physical health and functioning. In a recent study by researchers at the University of Illinois at Urbana-Champaign, older adults’ conscientiousness also seemed to influence the health status of their spouses, an effect called “compensatory conscientiousness.”

    Another personality trait, neuroticism, is associated with poorer health and physical limitations, the researchers found. Unlike conscientiousness, neuroticism—an enduring tendency to experience negative emotional states, often accompanied by anxiety or stress—did not predict a spouse’s or partner’s health status.

    However, the spouses of people who scored high in both neuroticism and conscientiousness were healthier than others, perhaps because the heightened sense of concern worked synergistically with diligence to produce greater awareness of a spouse’s health. The wives of men with this combination of personality traits reported better health than other women. A husband who is anxious about his wife’s health could be driven to activities that improve her health, the authors explain. This compensatory effect did not appear, however, for the husbands of women with high neuroticism and conscientiousness.

    Looking at 2006 self-report data from 2,203 couples who participated in the NIA-funded Health and Retirement Study of people ages 50 and older, the researchers found that older adults’ conscientiousness predicted their spouses’ health outcomes above and beyond the spouses’ own personality. “These results suggest that a conscientious partner is beneficial to an individual’s health no matter how conscientious that individual is,” they conclude. “Partners high in conscientiousness might be more reliable and consistent providers of support and might be a source of more constructive advice and feedback about health-related issues.”

    Reference:

    Roberts, B.W., et al. Compensatory conscientiousness and health in older couples. Psychol Sci. 2009. 20(5):553–9.

  • March 15, 2010

    A simple dietary intervention can reduce the risk and severity of chronic diseases and improve health, regardless of a person’s age or current health status, report researchers from the NIA and the Mount Sinai School of Medicine in New York. Lowering the intake of heat-processed foods, including pasteurized, dried, smoked, fried, or grilled foods, decreases levels of toxins called advanced glycation end products (AGEs). These compounds are believed to increase oxidative stress and inflammation in the body.

    The researchers randomly assigned 40 healthy adults in two age groups (18–45 and more than 60 years old) and 9 patients with kidney disease to consume either a low-AGE diet (a 30–50 percent reduction in AGE intake) or their normal diet. They then examined the relationships among age, dietary AGE level, serum AGE, peripheral mononuclear cell AGER1 (an antioxidant receptor involved in AGE metabolism), and markers of oxidative stress and inflammation, comparing values before and after the reduction in dietary AGE intake with those of participants whose intake was not reduced.

    AGE toxins reduce levels of AGER1, the scientists found. In addition, levels of AGER1 correlated significantly with levels of circulating AGE and markers of oxidative stress. The researchers concluded that reducing AGE in diets might lower oxidant stress and inflammation and restore levels of AGER1 in both healthy subjects and those with chronic disease, regardless of age.

    Reference:

    Vlassara, H., et al. Protection against loss of innate defenses in adulthood by low advanced glycation end products (AGE) intake: role of the antiinflammatory AGE receptor-1. J Clin Endocrinol Metab. 2009. 94(11):4483–91.

  • March 15, 2010

    In humans, genetic mutations usually lead to diseases or an increased risk of disease. But in the roundworm C. elegans, an induced mutation enabled a longer life span through the surprising transformation of “mortal” somatic cells into “immortal” germline cells. The study, funded in part by the NIA, was published recently in Nature.

    Somatic cells, which are involved in an animal’s growth, metabolism, and behavior, have a limited life span. In contrast, germline cells continue from one generation to the next. In this study, researchers found that certain genetic mutations known to extend the life span of C. elegans induced somatic cells to express two genes that are normally active only in reproductive germline cells.

    Normally, germline cells live longer than somatic cells, in part because they are more stable and better able to resist damaging stress. Here, the transformed somatic cells enabled the worms to live much longer than usual. The key was increased resistance to genotoxic stress, the result of new, protective insulin-like signaling pathways that preserved genomic stability, as well as enhanced RNA interference (RNAi). Conversely, the researchers found that inactivating germline-expressed genes in the mutant worms damaged DNA and shortened their normally long life span. “Taken together, these data suggest that soma-to-germline transformation, enhanced RNAi, and longevity share common regulatory mechanisms,” the researchers write.

    The study, led by Dr. Gary Ruvkun of Massachusetts General Hospital in Boston, advances understanding of the molecular pathways involved in the genetic stability and longevity of mammals. It also may help scientists develop new ways to repair and even regenerate cells and tissues, which could lead to therapies that protect against age-related decline.

    Reference:

    Curran, S.P., et al. A soma-to-germline transformation in long-lived Caenorhabditis elegans mutants. Nature. 2009 June 25. 459:1079–84.

  • July 5, 2010

    Dr. Robert Butler

    Dr. Robert N. Butler, NIA’s founding director, died July 4, 2010. He was 83.

    Dr. Butler leaves behind an unparalleled professional and personal legacy. There was no greater champion for older people and for the research and policies that could improve their lives. Dr. Butler came to the NIA on May 1, 1976. Two days later, he was awarded a Pulitzer Prize for his book Why Survive? Being Old in America, a passionate indictment of what he termed “ageism,” a prejudice he worked to eradicate throughout his life. At NIA, he set in place a visionary research endeavor, building a rationale and organization for a broad program of basic, biomedical, social, and behavioral research that remains at the core of our efforts today. A geriatric psychiatrist, Dr. Butler was particularly proud of focusing public and research attention on Alzheimer’s disease and other dementias.

    He left NIA in 1982 to lead a new Department of Geriatrics at the Mount Sinai School of Medicine in New York City, and he continued his advocacy for older people and for the study of aging by founding the International Longevity Center USA.

    “Bob Butler was a pioneer who sought to redefine aging, for both individuals and society,” said NIA Director Dr. Richard J. Hodes. “He challenged the status quo, looking at what can be achieved in later life, not at what might be lost. The field of aging research—and anyone seeking a better life with age—has lost a best friend.”
