The high predictive validity of self-rated health (SRH) is a major strength of this widely used population health measure. Recent studies, however, have noted that its predictive validity varies across population subgroups. The aim of this study is to examine respondents' age as a moderator of the predictive validity of SRH with respect to subsequent mortality risk.

Method:
Using data from the National Health Interview Survey–Linked Mortality Files (NHIS-LMF) 1986–2006, we estimate Cox proportional hazards models of all-cause and cause-specific mortality for adults aged 45–84 years as a function of their health ratings (N = 574,008).

Results:
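The age moderation examined here can be expressed as a proportional hazards specification in which SRH ratings interact with age. The sketch below is illustrative only; the variable coding and notation are assumptions, not the paper's exact model:

```latex
% Illustrative Cox specification (notation assumed, not the paper's exact model):
% h_i(t): hazard for respondent i at survival time t; h_0(t): baseline hazard
% SRH_{ik}: indicator that respondent i rates health at level k
%           (relative to a reference category such as "excellent")
% Age_i: respondent's age at interview (centered)
\[
  h_i(t) = h_0(t)\,
  \exp\!\Bigl( \textstyle\sum_{k} \beta_k \,\mathrm{SRH}_{ik}
             + \gamma\,\mathrm{Age}_i
             + \textstyle\sum_{k} \delta_k \,\mathrm{SRH}_{ik}\times\mathrm{Age}_i \Bigr)
\]
% Age moderation of predictive validity corresponds to \delta_k < 0:
% the log hazard ratio attached to a given SRH level shrinks with age.
```

Under this kind of specification, the age-specific hazard ratio for SRH level k is exp(beta_k + delta_k * Age), so a negative interaction term produces exactly the attenuation pattern reported in the results.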
The data show significant age moderation of the predictive validity of SRH at all levels of ratings: the hazard ratios for mortality decline by about half between the ages of 50 and 80 years. This attenuation appears primarily in earlier birth cohorts; there is no significant age attenuation in more recent cohorts, although this may be attributable in part to the earlier ages at which respondents in those cohorts are observed.

Discussion:
The finding of declining predictive validity of SRH with age implies that individuals may evaluate their health differently as they age. The results also suggest caution in using SRH to capture age-related health changes in the older population.