An analog very-large-scale integrated (VLSI) circuit was designed and built to implement Hebbian synapses with an improved method of modifying and storing the synaptic weights, for use in neural-network circuits. (In Hebbian synapses, the synaptic weights are modified through Hebbian learning, which is a local unsupervised adjustment of the weight depending on the correlation of activity between pre- and post-synaptic neurons.) These circuits are intended, more specifically, for use with neural networks of the type that operate with spiking (as distinguished from steady) input and output signals.

A Hebbian synapse for spiking neurons is illustrated in the figure.

The development of these circuits was prompted by a need to store and adjust on-chip synaptic weights using local Hebbian learning rules. The synaptic weights must be stored in the form of analog voltages (charges on capacitive nodes). Such storage is problematic because the charges tend to decay by leakage through reverse-biased active/well/substrate junctions. The designs of the present circuits reduce the leakage currents to about one-sixth of those of conventional synaptic-weight-storage circuits, thereby making it possible to store the synaptic weights for correspondingly longer times.
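The relationship between leakage current and storage time can be made concrete with a simple first-order model: the stored weight is a voltage on a capacitor, and the leakage current discharges it linearly. The following sketch uses illustrative component values (the capacitance, leakage currents, and drift tolerance are assumptions, not figures from the chip) to show how a sixfold reduction in leakage yields a sixfold increase in hold time.

```python
# Hypothetical sketch: how junction leakage limits analog weight storage.
# A stored weight is a voltage V on a capacitor C; a leakage current I_leak
# discharges it at the rate dV/dt = -I_leak / C.
C = 1e-12          # storage capacitance, 1 pF (illustrative value)
I_conv = 60e-15    # assumed leakage of a conventional storage node, 60 fA
I_new = I_conv / 6 # leakage reduced to ~1/6 by the floating-well design

dV_max = 0.01      # tolerable weight drift, 10 mV (illustrative)

# Hold time before the weight drifts by dV_max: t = C * dV_max / I_leak
t_conv = C * dV_max / I_conv
t_new = C * dV_max / I_new

print(f"conventional node hold time:  {t_conv:.3f} s")
print(f"floating-well node hold time: {t_new:.3f} s")  # ~6x longer
```

With these illustrative numbers, the conventional node drifts out of tolerance in about 0.17 s, while the reduced-leakage node holds for about 1 s.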

A circuit of the present type includes an analog-weight-storage subcircuit (depicted within the gray box in the figure) in which the transistor that passes charge onto the capacitive charge-storage node resides in a floating well. The floating well is driven by a voltage follower (VF in the figure) from the storage node, thereby shielding the storage node: the leakage current into the well is reduced, enabling the node to hold its charge longer than an ordinary switched capacitor could. The voltage across the active/well junction is held to within the offset voltage of the follower, which typically yields a substantial reduction in leakage current relative to a conventional well tied to the supply voltage of the well transistor.
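The bootstrapping idea can be illustrated numerically. In the sketch below, leakage across the junction is modeled (simplistically) as growing with reverse bias; the supply voltage, stored node voltage, and follower offset are assumed illustrative values, not measured parameters of the circuit.

```python
# Hypothetical sketch of the bootstrapping idea: the voltage follower drives
# the well to track the storage node, so the active/well junction sees only
# the follower's offset voltage rather than the full well-to-node drop.
def junction_bias(v_node, v_well):
    """Reverse bias (volts) across the active/well junction."""
    return abs(v_well - v_node)

V_DD = 3.3        # supply voltage (illustrative)
v_node = 1.2      # stored weight voltage (illustrative)
v_offset = 0.005  # assumed follower offset, 5 mV

# Conventional case: the well is tied to the supply rail.
bias_conventional = junction_bias(v_node, V_DD)
# Floating-well case: the follower holds the well within its offset of the node.
bias_floating = junction_bias(v_node, v_node + v_offset)

print(f"conventional junction bias:  {bias_conventional:.3f} V")
print(f"floating-well junction bias: {bias_floating:.3f} V")
```

The junction bias drops from roughly 2.1 V to the 5 mV follower offset; since bias-dependent leakage components shrink with the reverse bias, the storage node loses charge far more slowly.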

The charge-storage subcircuit is incorporated within a larger circuit that acts as a Hebbian synapse; that is, it takes pre- and post-synaptic spike signals as inputs, and increases the synaptic weight if the spikes occur simultaneously. The Hebbian-synapse circuit (see figure) contains two additional transconductance amplifiers (TAs): TA1 controls the learning rate by adjusting the current injected into the storage node, and TA2 converts the stored analog voltage value to a current value. The output of amplifier TA2 is gated by the pre-synaptic input spikes, and hence the final output consists of current pulses that are proportional to the stored voltage and injected into a post-synaptic neuron whenever the pre-synaptic neuron fires a spike. These output current pulses can be summed and integrated with the currents from other synapses in parallel and used to drive the spiking of the post-synaptic neuron. Hebbian learning is achieved by gating charge onto or off the storage node when the pre- and post-synaptic spikes are simultaneous or nonsimultaneous, respectively.
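The behavior described above can be sketched as a simple discrete-time model: charge is added to the weight when pre- and post-synaptic spikes coincide and removed when only one fires, and the output current, proportional to the stored voltage, is emitted only on pre-synaptic spikes. The learning-rate and transconductance parameters, the weight bounds, and the spike trains below are illustrative assumptions, not parameters of the actual chip.

```python
# Hypothetical behavioral sketch of the Hebbian-synapse circuit: TA1 gates
# charge onto/off the storage node by spike coincidence, and TA2's output
# current (proportional to the stored voltage) is gated by the pre-synaptic
# spike. All values are illustrative.
def step(w, pre, post, lr=0.05, g=1.0):
    """One time step: returns (new weight, output current pulse)."""
    if pre and post:
        w += lr               # coincident spikes: charge onto the node
    elif pre != post:
        w -= lr               # non-coincident spike: charge off the node
    w = max(0.0, min(1.0, w)) # stored voltage bounded by the rails
    i_out = g * w if pre else 0.0  # TA2 output gated by pre-synaptic spike
    return w, i_out

w = 0.5
pre_train  = [1, 1, 0, 1, 0]
post_train = [1, 1, 0, 0, 1]
for pre, post in zip(pre_train, post_train):
    w, i_out = step(w, pre, post)
    print(f"pre={pre} post={post} -> w={w:.2f}, i_out={i_out:.2f}")
```

In a network, the `i_out` pulses from many such synapses would be summed and integrated to drive the post-synaptic neuron, as the article describes.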

This work was done by Christopher Assad and David Kewley of Caltech for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at  under the Electronics & Computers category.


This Brief includes a Technical Support Package (TSP).
"Analog VLSI Circuits for Hebbian Learning in Neural Networks" (reference NPO-20965) is currently available for download from the TSP library.
