UCLA engineers have designed a new class of material that can learn behaviors over time and develop a “muscle memory” of its own, allowing it to adapt in real time to changing external forces. The material consists of a structural system of tunable beams that can alter their shape and behavior.

“This research introduces and demonstrates an artificial intelligent material that can learn to exhibit the desired behaviors and properties upon increased exposure to ambient conditions,” said Mechanical and Aerospace Engineering Professor Jonathan Hopkins, UCLA Samueli School of Engineering. “The same foundational principles that are used in machine learning are used to give this material its smart and adaptive properties.”

If the material were used in aircraft wings, it could learn to morph the shape of the wings based on wind patterns during flight for greater efficiency and maneuverability. Structures incorporating the material could also adjust during events such as an earthquake or other disasters.

The team utilized and adapted concepts from existing artificial neural networks (ANNs) — the algorithms that drive machine learning — to develop the mechanical equivalents of ANN components in an interconnected system. The mechanical neural network (MNN) consists of individually tunable beams oriented in a triangular lattice pattern. Each beam features a voice coil, strain gauges, and flexures that enable the beam to change its length, adapt to its changing environment in real time, and interact with other beams in the system.

The voice coil initiates the beam’s fine-tuned compression or expansion in response to new forces placed on it; the strain gauges collect data on the beam’s motion, which the learning algorithm uses to control the network’s behavior; and the flexures act as flexible joints that connect the beams into the system.
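To make those roles concrete, below is a minimal, hypothetical Python data model of a single beam and its lattice connections. The class, field names, and toy connectivity are illustrative assumptions, not the team’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class TunableBeam:
    """Hypothetical model of one MNN beam and the roles of its parts."""
    stiffness: float      # tunable axial stiffness, set by the learning algorithm
    length: float         # nominal length; the voice coil extends or compresses it
    strain: float = 0.0   # reading from the strain gauges along the beam

    def actuate(self, delta_length: float) -> None:
        """Voice coil: fine-tuned compression or expansion in response to applied forces."""
        self.length += delta_length

    def sense(self) -> float:
        """Strain gauges: report the beam's deformation to the learning algorithm."""
        return self.strain

# Flexures are the compliant joints that connect beams into the triangular lattice;
# here the lattice is just a list of beams plus pairs of connected beam indices.
lattice_beams = [TunableBeam(stiffness=1.0, length=0.1) for _ in range(21)]
flexure_connections = [(i, i + 1) for i in range(20)]   # toy connectivity, not the real lattice
```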

An optimization algorithm then regulates the entire system by extracting data from each of the strain gauges and determining a combination of rigidity values to control how the network should adapt to applied forces.
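The article does not detail the specific optimization algorithm, so the following Python sketch is only a toy illustration of the general idea: apply a load, compare the lattice’s (simulated) response to a desired one, and nudge each beam’s stiffness until the error drops. The linear response model and the simple coordinate-style search are stand-ins, not the team’s published method.

```python
import numpy as np

# Toy stand-in for the MNN: each "beam" is a linear element whose stiffness we can
# tune; the lattice response is approximated as a weighted sum of beam contributions.
# The real mechanics (flexures, geometry, beam-to-beam coupling) are far richer.

rng = np.random.default_rng(0)
n_beams = 21                                  # tunable beams in the toy lattice
coupling = rng.normal(size=(2, n_beams))      # how each beam affects two output displacements

def lattice_response(stiffness, load):
    """Placeholder for 'apply a force, read the strain gauges / output displacement'."""
    return coupling @ (stiffness * load)

def error(stiffness, load, target):
    """Squared error between measured and desired output displacements."""
    return float(np.sum((lattice_response(stiffness, load) - target) ** 2))

load = 1.0                                    # applied force (toy scalar)
target = np.array([0.5, -0.3])                # desired displacement of the output nodes

stiffness = np.ones(n_beams)                  # start every beam at nominal stiffness
step = 0.05

# Coordinate-style search standing in for the optimization algorithm: perturb one
# beam's stiffness at a time and keep the change only if the overall error drops.
for _ in range(200):
    for i in range(n_beams):
        for delta in (+step, -step):
            trial = stiffness.copy()
            trial[i] += delta
            if error(trial, load, target) < error(stiffness, load, target):
                stiffness = trial
                break

print("final error:", error(stiffness, load, target))
```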

Early system prototypes exhibited a lag between the input of the applied force and the output of the MNN response, which affected the system’s overall performance. The team tested multiple iterations of the strain gauges and flexures in the beams, as well as different lattice patterns and thicknesses, before arriving at the published design that remedied the lag.

The system is now about the size of a microwave oven, but the team plans to simplify the MNN design so that thousands of the networks can be manufactured at the microscale within 3D lattices for practical material applications.

Other potential uses for MNNs include armor that deflects shockwaves and acoustic imaging technologies that harness soundwaves.

Photo of an MNN. (Image: Flexible Research Group at UCLA)

Here is a Tech Briefs interview with Ryan Lee, Doctoral Graduate Student, Mechanical and Aerospace Engineering, who works in Hopkins’ UCLA lab.

Tech Briefs: What inspired the research?

Lee: This research was inspired by the structure of a neural network. We noticed how the fully connected weights in a neural network work, and we wondered whether a mechanical material could be made to behave like a neural network using similar mechanical connections.

Tech Briefs: What were the biggest technical challenges you faced?

Lee: The biggest technical challenge that we faced while designing this neural network material was determining an efficient method for controlling the variable stiffness elements.

Tech Briefs: Can you explain in simple terms how the technology works?

Lee: Our MNN consists of a lattice of beams, arranged in a triangular pattern, whose stiffness can be tuned individually.

We start by selecting how we want the material to deform for a given loading force. Once these deformations are defined, we apply the loads to the network. Then, in an iterative process, we adjust each connection’s stiffness until all the desired behaviors are achieved.
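As a rough, hypothetical illustration of that procedure, the sketch below defines target deformations for two load cases and uses an off-the-shelf gradient-free optimizer to adjust the stiffness values until both behaviors are approximately achieved. The linear toy model and all names are assumptions, not the group’s actual training code.

```python
import numpy as np
from scipy.optimize import minimize

# Toy training setup: choose how the lattice should deform for each applied load,
# apply the loads, then iteratively adjust every beam's stiffness until all of the
# desired behaviors are (approximately) achieved. The linear model is a placeholder.

rng = np.random.default_rng(1)
n_beams = 8

input_map = rng.normal(size=(n_beams, 2))    # how input-node loads strain each beam (toy)
output_map = rng.normal(size=(2, n_beams))   # how each beam's force moves the output nodes (toy)

def response(stiffness, load):
    """Placeholder for 'apply a load, measure the output-node deformation'."""
    return output_map @ (stiffness * (input_map @ load))

# Desired behaviors: (applied load at the input nodes, target output-node deformation)
behaviors = [
    (np.array([1.0, 0.0]), np.array([0.4, 0.0])),
    (np.array([0.0, 1.0]), np.array([0.0, 0.6])),
]

def total_error(stiffness):
    return sum(float(np.sum((response(stiffness, load) - target) ** 2))
               for load, target in behaviors)

# A gradient-free optimizer stands in for the iterative stiffness adjustment.
result = minimize(total_error, x0=np.ones(n_beams), method="Powell")
print("residual error across both behaviors:", result.fun)
```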

Tech Briefs: What’s the next step with regards to your research/testing?

Lee: The next step in our research will be to find ways to increase the speed at which we can train our neural network and to decrease the size of each element. Research into training efficiency and scalability will allow us to create larger, more computationally dense MNNs.

Tech Briefs: How far away are we from the technology becoming widely available?

Lee: The biggest obstacles to our technology becoming commonplace are the fabrication challenges and the optimization method challenges that we are currently working on.