Cascade error projection (CEP) is an improved learning algorithm for artificial neural networks. CEP is reliable and suitable for efficient implementation in very-large-scale integrated (VLSI) circuitry. In comparison with other neural-network-learning algorithms, CEP involves fewer iterations and is more tolerant of low resolution in the quantization of synaptic weights; thus, CEP learns relatively quickly and the circuitry needed to implement it is relatively simple.

CEP incorporates a cascading-architecture feature (see figure) of a prior algorithm called "cascade correlation." CEP also incorporates an independent-learning-neural-layer feature from cascade back-propagation.
In addition, CEP rests on a firm theoretical foundation: the learning process is modeled mathematically in the abstract space of synaptic-connection weights. The "projection" in CEP refers to an approach in which an error surface is projected onto the current hidden learning neuron and its synapses. The theoretical foundation is a theorem stating, in essence, that as hidden learning units are incorporated into the neural network sequentially in cascade, the resulting cascade of sequential subspaces ensures that the network converges on its learning objective.
This work was done by Tuan A. Duong of Caltech for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.nasatech.com/tsp under the Information Sciences category.
NPO-19644
This Brief includes a Technical Support Package (TSP).

Cascade Error-Projection Learning in Neural Networks
(reference NPO-19644) is currently available for download from the TSP library.
Overview
The document presents the Cascade Error Projection (CEP) algorithm, a novel learning technique for artificial neural networks developed by Tuan A. Duong at the Jet Propulsion Laboratory (JPL). The CEP algorithm addresses the challenges of real-time pattern recognition, classification, vision, and speech recognition, which often require non-linear techniques due to their complexity. Traditional linear methods are insufficient for these tasks, making neural networks a more suitable approach.
The CEP algorithm is characterized by its fast and simple learning process. It uses single-layer perceptron learning to obtain one set of weights, while the second set is obtained by direct calculation, allowing for efficient learning. This dual approach simplifies hardware implementation, since only a single perceptron learning mechanism is required and can be reused for each set of weights. Additionally, the algorithm incorporates a global calculation block to generate weights for newly added hidden units, enhancing its adaptability.
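The dual-weight idea above can be illustrated with a minimal sketch. This is not Duong's exact formulation; the perceptron-style gradient update, the least-squares step standing in for the "calculated" output weights, and the toy XOR task are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR-like task: inputs X, targets y in (0, 1).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

H = np.hstack([X, np.ones((len(X), 1))])  # inputs plus a bias column
residual = y - 0.5                        # error of a trivial initial guess

for unit in range(3):  # add hidden units one at a time (cascade)
    # (1) Learned weight set: train the new unit's input weights with
    #     simple gradient steps so its output tracks the residual error.
    w = rng.normal(scale=0.1, size=H.shape[1])
    for _ in range(500):
        a = sigmoid(H @ w)
        grad = H.T @ (residual * a * (1 - a))
        w += 0.5 * grad / len(X)
    a = sigmoid(H @ w)
    H = np.hstack([H, a[:, None]])        # cascade: unit feeds later units

    # (2) Calculated weight set: solve for the output weights in closed
    #     form (least squares), standing in for CEP's computed second set.
    v, *_ = np.linalg.lstsq(H, y, rcond=None)
    residual = y - H @ v

print(np.round(H @ v, 2))
```

Because the output weights are recomputed in closed form after each new unit, only the single hidden unit currently being added needs an iterative learning mechanism, which is what makes the reusable-hardware argument work.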
One of the key advantages of the CEP algorithm is its cost-effectiveness in hardware implementation. It operates with low-bit weight quantization (3 to 4 bits), which is particularly beneficial for learning networks that rely heavily on synaptic weights. The algorithm is also designed to be reliable, with theoretical backing that ensures convergence according to Lyapunov's criteria.
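What low-bit weight quantization means in practice can be shown with a short sketch. The uniform quantizer below, its `w_max` range, and the sample weights are illustrative assumptions, not a description of the actual VLSI circuitry:

```python
import numpy as np

def quantize(w, bits=4, w_max=1.0):
    # Uniform quantizer: snap each weight onto a grid of 2**bits levels
    # spanning [-w_max, +w_max], mimicking low-resolution synaptic
    # weights in hardware.
    levels = 2 ** bits
    step = 2 * w_max / (levels - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

w = np.array([0.73, -0.12, 0.40, -0.98])
print(quantize(w, bits=4))  # each value moves by at most half a step
```

At 4 bits the grid spacing is 2/15 of the weight range, so every stored weight is within about 6.7% of full scale of its ideal value; an algorithm tolerant of that error can be built from far simpler circuitry than one requiring high-precision weights.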
The document emphasizes the importance of hardware implementation for achieving the speed advantages of neural networks. It notes that many existing neuromorphic learning paradigms are primarily based on supervised learning techniques, with Error Backpropagation (EBP) being one of the most popular. However, the CEP algorithm offers a more efficient alternative, particularly in terms of hardware requirements and learning speed.
In summary, the CEP algorithm represents a significant advancement in neural network learning techniques, combining efficiency, simplicity, and cost-effectiveness. Its design is particularly suited for real-time applications in complex problem domains, making it a valuable contribution to the field of artificial intelligence and machine learning. The work was conducted under the auspices of NASA, highlighting its relevance to aerospace and other high-tech industries.

