The article develops the principles of optimal control theory for probability distributions on the configuration space of a controlled dynamical system. Necessary optimality conditions are derived, in the form of Pontryagin's maximum principle, for several classes of problems. Analytical representations of the dynamics in the space of distributions and in the dual space of observables are considered. Singular regimes are examined. Finally, the relationship between distribution control theory and nonlocal optimization problems is demonstrated.
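To fix ideas, the setting described above can be sketched as follows. This is an illustrative formulation under assumed notation (the vector field \(f\), terminal cost \(\ell\), distribution \(\rho_t\), and observable \(\psi_t\) are not taken from the article): a control system \(\dot{x} = f(x,u)\) lifts to a transport (Liouville) equation on distributions, the adjoint variable is an observable transported backward in time, and the Pontryagin-type condition couples the two.

```latex
% Hedged sketch of a distribution-control problem (assumed notation,
% not the article's exact formulation):
\begin{align}
  % forward dynamics: the Liouville equation on distributions
  &\partial_t \rho_t + \nabla \cdot \bigl( f(x, u(t))\, \rho_t \bigr) = 0,
    \qquad \rho_0 = \vartheta \quad \text{(given initial distribution)}, \\
  % terminal cost averaged over the ensemble
  &J[u] = \int \ell(x)\, \rho_T(dx) \;\to\; \min_u, \\
  % adjoint equation: backward transport of the observable
  &-\partial_t \psi_t(x) = \bigl\langle \nabla \psi_t(x),\, f(x, u(t)) \bigr\rangle,
    \qquad \psi_T = \ell, \\
  % Pontryagin-type condition, pointwise in time, averaged in space
  &u^*(t) \in \arg\min_{u}
    \int \bigl\langle \nabla \psi_t(x),\, f(x, u) \bigr\rangle\, \rho_t(dx).
\end{align}
```

The duality between the forward equation on \(\rho_t\) and the backward equation on \(\psi_t\) is what the phrase "representations in the space of distributions and observables" refers to: the same cost can be evaluated as \(\int \ell \, d\rho_T\) or as \(\int \psi_0 \, d\vartheta\).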