Immortality: Trust us, you wouldn’t like it.
It’s a comforting message, in a sour-grapes sort of way. It sounds wise and mature, suggesting that we put aside childish dreams and accept once and for all that there can be no vital Veg-O-Matic that slices mortality and dices infirmity. Gerontologists like it, being particularly eager to put on a respectable front and escape the whiff of snake oil that clings to the field of life extension.
In 1946 the newly founded Gerontological Society of America cited, in the first article of the first issue of its Journal of Gerontology, the need to concern ourselves with adding “not more years to life, but more life to years.” The dictum was famously sharpened 15 years later by John F. Kennedy, who told the delegates at the first White House Conference on Aging, “We have added years to life; it is time to think about how we add life to years.” Political theorist and futurist Francis Fukuyama was particularly eloquent, but hardly alone, when he warned two decades ago that if we maintain our obsession with extending life at all costs, society may “increasingly come to resemble a giant nursing home.”
Around the same time, noted aging researchers S. Jay Olshansky and Bruce Carnes wrote in ominous tones that we were treading into the realm of “manufactured survival time,” warning that “this success has been accompanied by a rise in frailty and disability in the general population. This is a consequence that neither the medical community nor society was prepared for.”1 A celebrated article by epidemiologist E.M. Gruenberg in 1977 bemoaned the “failures of success”: “at the same time that persons suffering from chronic diseases are getting an extension of life, they are also getting an extension of disease and disability.”
This message is particularly dire if lifespans rise over extended periods of time—which they have done. In 1936 Louis Dublin, the chief actuary of Metropolitan Life, teamed up with the esteemed mathematical demographer Alfred Lotka to calculate the maximum life expectancy theoretically possible. They came up with a limit of 69.93 years. This limit was exceeded by women in Iceland five years later, by American women in 1949, and by American men in 1979. Life expectancies have been increasing at a steady rate of 3 months per year for the past 175 years, and expert calculations of the maximum possible human lifespan have been exceeded, on average, five years after being made. In some cases, they had already been overtaken by events somewhere in the world at the time they were issued.
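For readers who want to see the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The pace of 3 months per year and the 69.93-year ceiling come from the paragraph above; the simple linear extrapolation and the 1936 starting value are illustrative assumptions, not figures from Dublin and Lotka.

```python
# Back-of-the-envelope sketch: a steady gain of 3 months (0.25 years) of life
# expectancy per calendar year, run against Dublin and Lotka's 1936 ceiling.
# The 1936 starting value below is an illustrative assumption, not their figure.

PACE = 0.25        # years of life expectancy gained per calendar year
LIMIT = 69.93      # Dublin and Lotka's theoretical maximum (1936)

year, record = 1936, 68.7   # assumed record female life expectancy in 1936
while record < LIMIT:
    year += 1
    record += PACE

print(f"Under a steady trend, the limit falls around {year} "
      f"(record life expectancy about {record:.2f} years).")
print(f"A further century of the same trend would add about {100 * PACE:.0f} years.")
```

On those assumptions the ceiling falls around 1941, roughly when Icelandic women did in fact pass it, and a century more of the same trend adds about 25 years.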
But what if long lifespans don’t necessarily mean more years of disability? At the turn of the present century, George C. Williams, celebrated evolutionary theorist of aging, attacked what he termed the “Tithonus error.” Tithonus, son of a nymph, lover of a goddess, was granted the boon of eternal life. But the further gift of eternal youth was unattainable. Frail, bent, and suffering, he shriveled at last into a cricket. Williams’ argument was almost a trivial one, from the perspective of evolutionary biology: The very aged are rare, hence there is unlikely to have been any evolutionary pressure to shape the timing of the end of life, in the way that the timing of early development has been shaped. What we see as the “natural lifespan” is simply a balance between the wear of daily life and the limited ability of repair mechanisms to undo it fully. Shifting the balance, either by increasing the rate or efficiency of repair, or by reducing the rate of damage, must surely stretch out the whole process. Actually, it should do even better than that: The end stage, where most of our suffering is found, ought to be the least susceptible to extension, since it requires maintaining the function of an organism that is failing on multiple levels. This is consistent with the observation that, while mortality rates have been falling at all ages, the pace of progress has been slowest at advanced ages. Youth, according to this argument, should take up a greater portion of our lifespan over time. In 1980 the medical researcher James Fries called this process “compression of morbidity.”
The morbidity-compression concept—simultaneously a description, a prediction, and a target for future health policy—received a tremendous boost in the mid-1980s, when the economic historian Robert Fogel discovered that a great deal of it had already taken place without anyone having noticed. When he started examining the medical records of Union Army Civil War volunteers and veterans, he found a staggering level of what we would now consider to be age-related degenerative disease—arthritis, heart disease, cancer—at ages when it would now be exceedingly rare. Of the recruits under age 20 who were examined for service in the Union Army in 1861 (nearly all of them volunteers), 16 percent were rejected on medical grounds. This rose to nearly a quarter of those aged 20 to 25, and nearly half of those in their 30s.
By comparison, the generation of American men who fought in World War II and are now in their 90s lived, once they reached adulthood, about eight years longer on average than their great-grandfathers who fought in the Civil War. Those among the 19th-century men who did survive into what we now call middle age also spent more of their years suffering chronic, debilitating illnesses. Specifically, the average age of onset of arthritis was 64.7 years for the WWII veterans, but only 53.7 years for the Civil War veterans. Heart disease started nearly 10 years later, and chronic respiratory disease more than 11 years later.
Comparing the Union Army results with late 20th-century health surveys has led to estimates of disability-rate declines of 0.6 percent per year, accelerating to 1.7 percent per year in the 1980s and 1990s.2 These trends have continued into the 21st century, at least according to some measures. According to the 1985 United States Health Interview Survey, 23 percent of Americans aged 50 to 64 reported limitations on daily activities due to chronic illness; in 2014, this was down to 16 percent. At ages 65 and higher, the corresponding figures were 39 percent and 33 percent.
A recent study by University of Southern California gerontologist Eileen Crimmins and her colleagues looked at the change in disability-free life expectancy—the average number of years that we would expect someone to live free of major limitations due to long-term illness. From 1970 to 2010 American males gained about 7.7 years of life expectancy at birth, of which 3.2 (a bit over 40 percent) could be expected to be disability-free. Perhaps more immediately relevant, Americans aged 65 saw their remaining life expectancy increase from 15 to 19 years, with 2.5 of the 4 extra years being disability-free. (This averages the results for men and women; women gained fewer years overall than men, but the balance between disability-free and disabled years gained is similar.) The largest increase in healthy years after age 65 came in the last decade. Americans in 2010 could expect to live 80 percent of their lives without major disability, including well over half of their years after age 65.
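The proportions in that paragraph follow from simple division. The sketch below, in Python, merely re-derives them from the numbers quoted above; nothing in it comes from the Crimmins study beyond those inputs, and the percentages are computed here for illustration.

```python
# Re-deriving the proportions from the Crimmins figures quoted above.
# The four input numbers come from the text; only the division is added here.

gain_at_birth = 7.7      # extra years of life expectancy at birth, US males, 1970-2010
gain_at_birth_df = 3.2   # of which disability-free

gain_at_65 = 19 - 15     # extra years of remaining life expectancy at age 65
gain_at_65_df = 2.5      # of which disability-free

print(f"Disability-free share of the gain at birth: {gain_at_birth_df / gain_at_birth:.0%}")  # roughly 42%
print(f"Disability-free share of the gain at 65:    {gain_at_65_df / gain_at_65:.0%}")        # roughly 62%
```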
Perhaps most striking, a new study has found that over the past two decades the incidence of dementia has dropped by 20 percent.3 Men in the United Kingdom develop dementia today at the same rate as men five years younger did in the 1990s; for women the improvements have been more modest.
Imagine, now, that the trend of the last century continues another hundred years: Our 50-year-old great-grandchildren may have an average of 50 years left to live, the same span that a 30-year-old can expect today. It is not implausible that they will be similarly spry and untouched by disability. Will they really think of themselves as young, in the same way that a 30-year-old today does? Will youth extended still be youth? It is not as absurd as it may seem. When the U.S. National Institutes of Health founded its Aging Unit—later the National Institute on Aging—in 1940, it announced the goal of research into “the problems of aging,” particularly “the period between 40 and 60 years of age.”4
On the other hand, the story of morbidity compression could be about to change. That’s because the big drivers of compression have already acted. Medical technology will continue to advance, but, for all its marvels, it has played a smaller role in compression than basic improvements in nutrition and hygiene. It turns out that much of what might have been considered normal age-related decline is strongly accelerated by disease and malnutrition early in life, even before birth. Babies of malnourished mothers, even those who received adequate nutrition after birth, are found decades later to have substantially elevated incidences of coronary heart disease, stroke, and hypertension. Survivors of childhood smallpox and whooping cough have generally higher levels of mortality in old age.
It seems clear that the wealthy part of the world has reached the peak of benefits to be gained from increased nutrition and basic hygiene—if we have not, on both scores, actually gone well past it. The nexus between improved nutrition and improved adult and late-life health was marked, in the past, by increasing height: Final adult height summarizes the whole record of childhood health and nutrition, and the past two centuries of increased lifespans consistently tracked increases in average height. These increases have plateaued, by and large, in Japan and Western Europe, though some laggards—in particular the U.S., which used to have one of the tallest populations in the world—clearly have some space to catch up.
The massive benefits from vaccination, the elimination of leaded gasoline, and the reduction of smoking are still making their way through the aging population, of course, and will likely be stretching our healthy lifespans for some time to come. Tremendous progress could still be achieved by spreading the healthful environments of wealthy countries to the rest of the world, and the healthful lifestyles of the wealthy within those countries to the rest of the population.
But beyond these effects, and especially for Western countries, morbidity compression will not be what it once was. The most optimistic scenario for the near future of healthy aging may be what demographer Kenneth Manton has called “dynamic equilibrium.” Manton suggested that disease would not be prevented or delayed, but managed and arrested at an early stage, so that an increasing portion of the population would be living with mild disease, while fewer would suffer severe disability.
An illustration of this may be found in a recent study of the timing of disability by health economists David Cutler, Kaushik Ghosh, and Mary Beth Landrum. By their measure, expected disability-free years after age 65 expanded from 8.8 to 10.4 and disabled years actually contracted, from 8.5 to 7.8, between 1992 and 2004—with the disabled years increasingly concentrated in the period immediately preceding death. The picture looked slightly different, though, when framed in terms of “disease-free” years. Disease-free life expectancy after age 65 barely increased, from 8.0 to 8.6, while years with disease increased from 9.5 to 9.7. More disease, then, but less disability.
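Setting those four pairs of numbers side by side makes the pattern easier to see. The short Python sketch below only adds up and compares the figures quoted above; the totals and changes are derived here for illustration and are not values reported by the study.

```python
# Tallying the Cutler-Ghosh-Landrum figures quoted above (1992 vs. 2004).
# Only the four pairs of input numbers come from the text; the totals and
# changes are computed here for illustration.

figures = {
    "disability-free years": (8.8, 10.4),
    "disabled years":        (8.5,  7.8),
    "disease-free years":    (8.0,  8.6),
    "years with disease":    (9.5,  9.7),
}

for label, (y1992, y2004) in figures.items():
    print(f"{label:>22}: {y1992:4.1f} -> {y2004:4.1f}  (change {y2004 - y1992:+.1f})")

# Total expected years after age 65, under either split:
print(f"total, disability split: {8.8 + 8.5:.1f} -> {10.4 + 7.8:.1f}")
print(f"total, disease split:    {8.0 + 9.5:.1f} -> {8.6 + 9.7:.1f}")
```

Either way the total climbs by about a year; the difference is whether the added time registers as years with disease (it does) or as years with disability (it does not).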
Diabetes was transformed early in the 20th century from a killer to a manageable chronic disease; AIDS went this way in the early 21st century. The biggest killer of all in the developed world, cardiovascular disease (CVD), may also be making this transition. Eileen Crimmins, who has generally been cautious about claims of morbidity compression, has pointed out that most of the decline in CVD mortality in the 1950s and 1960s came from disease-preventing lifestyle changes, while the hardly less remarkable reductions over the last few decades—up to 70 percent reductions in mortality in Western Europe and the U.S.—were largely due to medical treatment and disease management.5
Will the fit 90-year-olds of the future need to expend the strength they have maintained on lugging around the contents of a large medicine cabinet to keep them going? A recent study found that while the fraction of elderly Americans (age 65 and over) taking five or more prescription medications had been increasing, it seems to have stabilized in recent years at just under 40 percent. Many of the most effective treatments for age-related conditions are set-it-and-forget-it—hip replacements, for instance, and cardiac pacemakers. Implantable drug pumps are already in use.
We long ago got used to thinking that a person can still be youthful and healthy even when needing spectacles to see clearly, or when their survival depends on an artificial supply of insulin. The spry 90-year-olds of the future may be no different, hearing through cochlear implants and running with leg and heart muscles rebuilt with stem-cell treatments. Whatever the future of aging is, there is no sign yet of any limit to our ability to expand each of the phases of our lives.
David Steinsaltz is an associate professor of statistics at the University of Oxford. He blogs at Common Infirmities.
References
1. Olshansky, S.J., Carnes, B., & Grahn, D. Confronting the boundaries of human longevity. American Scientist 86, 52 (1998).
2. Costa, D.L. Understanding the twentieth-century decline in chronic conditions among older men. Demography 37, 53-72 (2000).
3. Matthews, F.E., et al. A two decade dementia incidence comparison from the Cognitive Function and Ageing Studies I and II. Nature Communications 7 (2016). doi:10.1038/ncomms11398
4. Gilleard, C. & Higgs, P. Aging without agency: Theorizing the fourth age. Aging & Mental Health 14, 121-128 (2010).
5. Crimmins, E.M. & Beltrán-Sánchez, H. Mortality and morbidity trends: Is there compression of morbidity? The Journals of Gerontology: Social Sciences 66B, 75-86 (2011).