Classic deterministic genetic models of the evolution of selfing predict that species should be either completely outcrossing or completely selfing. However, even species considered high selfers outcross to a small degree (e.g. Arabidopsis thaliana and Caenorhabditis elegans). This discrepancy between theory and data may exist because the classic models ignore the effects of drift interacting with selection, that is, Hill–Robertson effects. High selfing rates drive the effective rate of recombination toward zero, which is expected to cause the build-up of negative linkage disequilibria in finite populations. Despite the transmission advantage associated with complete selfing, low levels of outcrossing may be favoured because of the benefits of increasing the effective rate of recombination to dissipate negative disequilibria. Using multilocus simulations, we confirm that selfing reduces effective population size through background selection and causes negative disequilibria between selected sites. Consequently, the rate of adaptation is substantially reduced in strong selfers. When the selfing rate is allowed to evolve, populations evolve to be either strong outcrossers or strong selfers, depending on the parameter values. Amongst selfers, low, but nonzero, levels of outcrossing can be maintained by selection even when all mutations are deleterious; more outcrossing is maintained with higher rates of deleterious mutation. The addition of beneficial mutations can (i) lead to a quantitative increase in the degree of outcrossing amongst stronger selfers but (ii) may cause outcrossing species to evolve into stronger selfers.
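The claim that high selfing rates drive the effective rate of recombination toward zero can be sketched numerically. The snippet below is an illustrative calculation, not part of the paper's simulations: it assumes the standard equilibrium inbreeding coefficient under partial selfing, F = sigma / (2 - sigma) for selfing rate sigma, and the common approximation that the effective recombination rate scales as r_e = r * (1 - F).

```python
def inbreeding_coefficient(sigma):
    # Equilibrium inbreeding coefficient under a constant selfing rate sigma:
    # F = sigma / (2 - sigma). F = 0 for full outcrossing, F = 1 for full selfing.
    return sigma / (2.0 - sigma)

def effective_recombination(r, sigma):
    # Standard approximation: selfing discounts recombination by the
    # probability a genotype is heterozygous, r_e = r * (1 - F).
    return r * (1.0 - inbreeding_coefficient(sigma))

# Effective recombination (relative to r) across selfing rates.
for sigma in (0.0, 0.5, 0.9, 0.99, 1.0):
    print(f"sigma={sigma:.2f}  r_e/r={effective_recombination(1.0, sigma):.4f}")
```

Even a selfing rate of 0.99 leaves a small residue of effective recombination (about 2% of r here), which is the margin through which rare outcrossing can dissipate negative linkage disequilibria.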