Cascade error projection (CEP) is an improved learning algorithm for artificial neural networks. CEP is reliable and suitable for efficient implementation in very-large-scale integrated (VLSI) circuitry. In comparison with other neural-network learning algorithms, CEP requires fewer iterations and is more tolerant of low resolution in the quantization of synaptic weights; thus, CEP learns relatively quickly, and the circuitry needed to implement it is relatively simple.
CEP incorporates a cascading-architecture feature (see figure) of a prior algorithm called "cascade correlation." CEP also incorporates an independent-learning feature for each neural layer from a prior algorithm called "cascade back-propagation."
In addition, CEP is built on a firm theoretical foundation, in which the learning process is modeled mathematically in the abstract space of synaptic-connection weights. The "projection" aspect of CEP denotes an approach in which an error surface is projected onto the current hidden learning neuron and its synapses. The theoretical foundation is provided by a theorem that says, in essence, that as the learning hidden neural units are incorporated into the neural network sequentially in cascade, the resulting cascade of sequential subspaces ensures that the neural network converges on its learning objective.
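The brief does not give CEP's update rules, but the cascade idea it describes — adding hidden units one at a time, each trained on the residual error of the frozen network built so far — can be illustrated with a minimal sketch. The function names, one-dimensional setting, and simple gradient updates below are illustrative assumptions, not the published algorithm:

```python
# Hedged sketch of cascade-style learning: hidden units are added
# sequentially, and each new unit learns independently against the
# residual error of the (frozen) network built so far.
# This is an illustration of the cascade idea, not the actual CEP rules.
import math
import random

def cascade_train(xs, ys, n_hidden=3, epochs=300, lr=0.1):
    random.seed(0)
    units = []  # each unit: (w, b, v) = input weight, bias, output weight

    def predict(x):
        # Network output: sum of the frozen hidden units' contributions.
        return sum(v * math.tanh(w * x + b) for w, b, v in units)

    for _ in range(n_hidden):
        # Candidate unit starts with small random parameters.
        w, b, v = random.uniform(-1.0, 1.0), 0.0, 0.0
        for _ in range(epochs):
            for x, y in zip(xs, ys):
                h = math.tanh(w * x + b)
                # Residual error of current network plus the candidate unit.
                r = y - (predict(x) + v * h)
                # Gradient steps on the candidate's parameters only;
                # earlier units stay frozen (independent learning).
                v += lr * r * h
                g = r * v * (1.0 - h * h)  # backprop through tanh
                w += lr * g * x
                b += lr * g
        units.append((w, b, v))  # freeze the trained unit into the cascade

    return predict
```

For example, fitting a smooth one-dimensional target this way reduces the residual error as each successive unit is cascaded onto the network.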
This work was done by Tuan A. Duong of Caltech for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.nasatech.com/tsp under the Information Sciences category.
This Brief includes a Technical Support Package (TSP).
Cascade Error-Projection Learning in Neural Networks
(reference NPO-19644) is currently available for download from the TSP library.