Cosmogenic nuclides in rock, soil, and sediment are routinely used to measure denudation rates of catchments and hillslopes. Although these measurements have been shown to be prone to biases due to chemical erosion in regolith, most cosmogenic nuclide studies have ignored this potential source of error. Here we quantify the bias introduced when effects of chemical erosion are overlooked in interpreting denudation rates from cosmogenic nuclides. We consider two end-member effects: one due to weathering near the surface and the other due to weathering at depth. Near the surface, chemical erosion influences nuclide concentrations in host minerals by enriching (or depleting) them relative to other, more (or less) soluble minerals, thereby increasing (or decreasing) their residence times relative to the regolith as a whole. At depth, where minerals are shielded from cosmic radiation, chemical erosion causes denudation without influencing cosmogenic nuclide buildup. If this effect is ignored, denudation rates inferred from cosmogenic nuclides will be too low. We derive a general expression, termed the ‘chemical erosion factor’, or CEF, which corrects for biases introduced by both deep and near-surface chemical erosion in regolith. The CEF differs from the ‘quartz enrichment factor’ of previous work in that it can also be applied to relatively soluble minerals, such as olivine. Using data from diverse climatic settings, we calculate CEFs ranging from 1.03 to 1.87 for cosmogenic nuclides in quartz. This implies that ignoring chemical erosion can lead to errors approaching 100% in intensely weathered regolith. The CEF is strongly correlated with mean annual precipitation across our sites, reflecting a climatic influence on chemical weathering. Our results indicate that quantifying CEFs is crucial in cosmogenic nuclide studies of landscapes where chemical erosion accounts for a significant fraction of the overall denudation.
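
The role of the correction can be sketched schematically as follows (the symbols here are illustrative and not necessarily the paper's notation): if $D_{\mathrm{app}}$ denotes the denudation rate inferred from nuclide concentrations while ignoring chemical erosion, and $D$ the corrected rate, then

```latex
\[
  D \;=\; \mathrm{CEF} \times D_{\mathrm{app}},
\]
```

so that for the quartz data reported here ($1.03 \le \mathrm{CEF} \le 1.87$) the apparent rates underestimate true denudation by as much as a factor of 1.87. Because near-surface weathering can either enrich or deplete the host mineral, a CEF below 1 is possible in principle, even though all values observed at these sites exceed 1.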