This article considers the cost-dependent construction of linear and piecewise linear classifiers. Classical learning algorithms from the fields of artificial neural networks and machine learning either consider no costs at all or allow only costs that depend on the classes of the training examples. In contrast to such class-dependent costs, we consider example-dependent costs, i.e. costs that depend on both the features and the class of each example. We present a cost-sensitive extension of a modified version of the well-known perceptron algorithm that can also be applied when the classes are not linearly separable. We also present an extended version of the hybrid learning algorithm DIPOL that handles linear non-separability, multi-modal class distributions, and multi-class learning problems. We show that example-dependent costs are a true generalization of class-dependent costs. The approach is general and can be extended to other neural network architectures such as multi-layer perceptrons and radial basis function networks.
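To make the notion of example-dependent costs concrete, the following is a minimal sketch (not the paper's actual algorithm) of how a perceptron update can be weighted by a per-example misclassification cost: on each mistake, the update is scaled by that example's cost, so expensive examples pull the decision boundary more strongly. All names and the fixed-epoch stopping rule here are illustrative assumptions.

```python
import numpy as np

def cost_sensitive_perceptron(X, y, costs, epochs=20, lr=1.0):
    """Illustrative cost-weighted perceptron (not the paper's exact method).

    X     : (n, d) feature matrix
    y     : (n,) labels in {-1, +1}
    costs : (n,) positive misclassification cost of each example
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in range(n):
            # Mistake (or example exactly on the boundary):
            if y[i] * (X[i] @ w + b) <= 0:
                # Scale the classical perceptron update by this
                # example's individual cost.
                w += lr * costs[i] * y[i] * X[i]
                b += lr * costs[i] * y[i]
    return w, b
```

With all costs equal this reduces to the ordinary perceptron; with class-wise constant costs it recovers the class-dependent setting, which illustrates why example-dependent costs generalize both.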