Over the past few years, deep learning algorithms have proven highly successful at complex cognitive tasks such as controlling self-driving cars and understanding language. At the heart of these algorithms are artificial neural networks — mathematical models of the neurons and synapses of the brain — that are fed huge amounts of data so that their synaptic strengths are autonomously adjusted to learn the intrinsic features and hidden correlations in these data streams.

[Figure: PCM devices store information in their resistance/conductance states and exhibit conductivity modulation based on the programming history. (Image: IBM)]

The implementation of these brain-inspired algorithms on conventional computers is highly inefficient, consuming huge amounts of power and time. This has prompted engineers to search for new materials and devices that can be used to build special-purpose computers able to run these algorithms efficiently. Nanoscale memristive devices — electrical components whose conductivity depends on their prior signaling history — can be used to represent the synaptic strength between the neurons in artificial neural networks.
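
To make the idea concrete, here is a minimal Python sketch (an illustration, not IBM's implementation) of how device conductances arranged in a crossbar can encode the weights of a network layer. It assumes a common differential scheme, with a pair of conductances per signed weight, so that summed column currents perform the layer's vector-matrix multiplication in a single physical step:

import numpy as np

# Minimal sketch: conductances on a crossbar encode synaptic weights, and
# Ohm's and Kirchhoff's laws give the layer's vector-matrix product as
# summed column currents. Conductance range is an illustrative assumption.

rng = np.random.default_rng(0)

G_MIN, G_MAX = 0.1e-6, 10e-6      # assumed conductance range, in siemens

def weights_to_conductances(W):
    # Map signed weights onto a differential pair of conductances:
    # W ~ G_plus - G_minus, so negative weights are representable with
    # physically non-negative conductances.
    scale = (G_MAX - G_MIN) / np.abs(W).max()
    G_plus = G_MIN + np.clip(W, 0, None) * scale
    G_minus = G_MIN + np.clip(-W, 0, None) * scale
    return G_plus, G_minus, scale

def crossbar_forward(v_in, G_plus, G_minus, scale):
    # Column currents I = v @ (G+ - G-); equivalent to v @ W up to scale.
    return v_in @ (G_plus - G_minus) / scale

W = rng.normal(size=(4, 3))        # a tiny 4-input, 3-output layer
v = rng.normal(size=4)             # input voltages (arbitrary units)

Gp, Gm, s = weights_to_conductances(W)
print(np.allclose(crossbar_forward(v, Gp, Gm, s), v @ W))  # True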

While memristive devices could potentially lead to faster and more power-efficient computing systems, they are plagued by several reliability issues common to nanoscale devices. Much of their efficiency stems from their ability to be programmed in an analog manner, so that a single device can store multiple bits of information; however, their electrical conductivities change in a nonlinear and non-deterministic fashion, which limits the precision of such analog programming.
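
The following toy model (with illustrative parameter values, not calibrated to any real device) shows why this matters: when each programming pulse moves the conductance by a state-dependent, saturating amount plus random variability, nominally identical devices receiving identical pulse trains end up in different states:

import numpy as np

rng = np.random.default_rng(1)

G_MIN, G_MAX = 0.1, 10.0   # arbitrary units; illustrative values only

def apply_pulse(g, n_pulses=1):
    # Illustrative nonideal update: each potentiating pulse adds a
    # state-dependent (nonlinear, saturating) increment plus randomness.
    # Not a calibrated PCM model.
    for _ in range(n_pulses):
        mean_step = 0.5 * (1 - (g - G_MIN) / (G_MAX - G_MIN))
        step = rng.normal(loc=mean_step, scale=0.3 * mean_step + 0.05)
        g = np.clip(g + step, G_MIN, G_MAX)
    return g

# Ten "identical" devices receive the same 20 pulses, yet end up with
# different conductances: the stochasticity that limits analog precision.
finals = [apply_pulse(G_MIN, n_pulses=20) for _ in range(10)]
print([round(float(g), 2) for g in finals])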

Researchers have demonstrated a novel synaptic architecture that may lead to a new class of information-processing systems inspired by the brain. The device design is based on phase change memory (PCM), an emerging non-volatile memory technology. An electric pulse applied to the material changes the conductance of the device by altering its physical state, shifting the material between its low-conductance amorphous and high-conductance crystalline phases.
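
A rough sketch of this behavior, with assumed rather than measured numbers: in PCM, successive partial-SET pulses grow the crystalline volume, so conductance rises gradually, while a RESET pulse melt-quenches the cell back to the low-conductance amorphous phase abruptly. This asymmetry is one reason architectures built from more than one device per synapse are attractive:

# Illustrative PCM behavior (values are assumptions, not measured data):
# repeated partial-SET pulses crystallize more material, so conductance
# increases gradually; a RESET pulse melt-quenches the cell back to the
# amorphous phase, so conductance drops abruptly to near its minimum.

G_AMORPHOUS, G_CRYSTALLINE = 0.1, 10.0   # arbitrary units

def partial_set(g):
    # Gradual potentiation: each pulse crystallizes a bit more material.
    return min(g + 0.08 * (G_CRYSTALLINE - g), G_CRYSTALLINE)

def reset(g):
    # Abrupt depression: melt-quench returns the cell to amorphous.
    return G_AMORPHOUS

g = G_AMORPHOUS
trace = []
for _ in range(30):          # 30 potentiating pulses: smooth ramp upward
    g = partial_set(g)
    trace.append(round(g, 2))
g = reset(g)                 # one RESET pulse: immediate collapse
trace.append(round(g, 2))
print(trace)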

The multi-memristive synaptic architecture enables researchers to increase synaptic precision without increasing the power density, even though several memristive devices are used to represent one synapse. A selection mechanism, based on a global counter, determines which device should change and when, so that only one device is programmed per update. The only cost is the additional chip area required for the extra PCM devices.
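
One way to picture the scheme (a simplified interpretation of the published architecture, not IBM's code): each synapse owns N devices whose combined conductance represents the synaptic weight, and a single counter shared by all synapses picks exactly one device to program per update, so programming power per update stays at the single-device level while effective precision grows with N:

import numpy as np

rng = np.random.default_rng(2)

G_MIN, G_MAX = 0.1, 10.0   # arbitrary units; illustrative values only

class MultiMemristiveSynapse:
    # Sketch of the multi-device idea: N devices jointly represent one
    # synapse, and the weight is the average of their conductances. On
    # each update, a counter shared by all synapses selects exactly ONE
    # device to program.

    counter = 0          # the global selection counter, shared class-wide

    def __init__(self, n_devices=4):
        self.g = np.full(n_devices, G_MIN)

    @property
    def weight(self):
        return self.g.mean()

    def potentiate(self):
        i = MultiMemristiveSynapse.counter % len(self.g)
        # Nonideal single-device update: nonlinear, saturating, noisy.
        step = rng.normal(0.5, 0.1) * (1 - self.g[i] / G_MAX)
        self.g[i] = np.clip(self.g[i] + step, G_MIN, G_MAX)
        MultiMemristiveSynapse.counter += 1   # advance after every update

syn = MultiMemristiveSynapse(n_devices=4)
for _ in range(12):               # updates rotate across the 4 devices
    syn.potentiate()
print(round(float(syn.weight), 3))

Because each individual update is noisy, averaging over N devices reduces the impact of any single device's variability, which is where the extra precision comes from.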

To test the architecture, both a spiking and a non-spiking neural network were trained. The task was handwritten digit recognition: the network had to recognize which digit appeared in each handwritten image. In both cases, the multi-memristive synapse significantly outperformed conventional differential architectures with two devices, illustrating the effectiveness of the proposed approach. An experimental demonstration of the multi-memristive synaptic architecture in a spiking neural network used more than 1 million PCM devices.
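
As a toy stand-in for those experiments (which used far larger spiking and non-spiking networks and more than 1 million real PCM devices), the sketch below trains a single-layer classifier on scikit-learn's small 8x8 digits set, routing every weight update through one of several simulated devices selected by a global counter; all device parameters are illustrative assumptions:

import numpy as np
from sklearn.datasets import load_digits

rng = np.random.default_rng(3)
X, y = load_digits(return_X_y=True)
X = X / 16.0                                   # scale pixel values to [0, 1]

N_DEV, G_MAX, LR = 4, 1.0, 0.1
Gp = np.zeros((64, 10, N_DEV))                 # "potentiation" device group
Gm = np.zeros((64, 10, N_DEV))                 # "depression" device group
counter = 0                                    # global device-selection counter

def program(G, mask):
    # Noisy, clipped update applied to the one globally selected device.
    step = LR * (1 + 0.2 * rng.standard_normal(mask.shape)) * mask
    G[..., counter % N_DEV] = np.clip(G[..., counter % N_DEV] + step,
                                      0.0, G_MAX)

for epoch in range(5):
    for i in rng.permutation(len(X)):
        w = Gp.sum(-1) - Gm.sum(-1)            # effective 64x10 weight matrix
        pred = int(np.argmax(X[i] @ w))
        if pred != y[i]:                       # perceptron-style correction
            program(Gp, np.outer(X[i], np.eye(10)[y[i]]))  # raise true class
            program(Gm, np.outer(X[i], np.eye(10)[pred]))  # lower wrong class
            counter += 1                       # rotate to the next device

w = Gp.sum(-1) - Gm.sum(-1)
print("training-set accuracy:",
      round(float((np.argmax(X @ w, 1) == y).mean()), 2))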

While machine learning algorithms have seen significant success over the past decade across a wide variety of complex cognitive tasks, their use in mobile devices and sensors embedded in the real world requires new technological solutions with substantially lower energy consumption and higher efficiency. The architecture is applicable to a wide range of neural networks and memristive technologies and is crossbar-compatible. The proposed architecture and its experimental demonstration are a significant step towards the realization of highly efficient, large-scale neural networks based on memristive devices.

For more information, contact Tanya Klein at 973-596-3433.