On the Parsimony of the Multi-Layer Perceptrons when Processing Encoded Symbolic Variables

This article addresses the issue of symbolic processing with Multi-Layer Perceptrons (MLPs) through encoding. Given an encoding, we propose a lower bound on the number of parameters an MLP needs to perform a random mapping from its input symbolic space to its output symbolic space. In the case of what we call binary encoding, the required number of parameters can be computed theoretically. From these two results, we show that the most efficient encodings are those which use one input unit per value.
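As an illustrative sketch (not the paper's derivation), the two encodings discussed above lead to different input and output widths for the same symbolic alphabet: a binary encoding of V values uses ceil(log2(V)) units, while a one-unit-per-value (local) encoding uses V units. The helper below counts the weights and biases of a single-hidden-layer MLP for each case; the hidden size `h` is a hypothetical fixed choice here, whereas the paper's result concerns the number of parameters actually required for a random mapping, which depends on the encoding.

```python
import math

def mlp_params(n_in, n_hidden, n_out):
    # Weights and biases of a single-hidden-layer MLP:
    # input->hidden weights + hidden biases, hidden->output weights + output biases.
    return (n_in + 1) * n_hidden + (n_hidden + 1) * n_out

V = 16  # number of symbolic values (hypothetical example)
h = 8   # hidden units (hypothetical fixed choice)

binary_width = math.ceil(math.log2(V))  # binary encoding: 4 units for 16 values
local_width = V                         # one unit per value: 16 units

print(mlp_params(binary_width, h, binary_width))  # 76
print(mlp_params(local_width, h, local_width))    # 280
```

For a fixed hidden size the local encoding uses more parameters, so the paper's claim of its efficiency rests on the smaller hidden layer it requires to realize an arbitrary symbolic mapping, not on raw input width.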
