In a breakthrough for energy-efficient computing, UC Berkeley engineers have shown for the first time that magnetic chips can actually operate at the lowest fundamental energy dissipation theoretically possible under the laws of thermodynamics. This means that dramatic reductions in power consumption are possible — down to as little as one-millionth the amount of energy per operation used by transistors in modern computers.
This is critical for mobile devices, which demand powerful processors that can run for a day or more on small, lightweight batteries. On a larger, industrial scale, as computing increasingly moves into “the cloud,” the electricity demands of the giant cloud data centers are multiplying, collectively taking an increasing share of the country’s — and world’s — electrical grid.
Lowering energy use represents a relatively recent shift in focus for chip manufacturing, after decades of emphasis on packing ever greater numbers of ever smaller, faster transistors onto chips.
“Making transistors go faster was requiring too much energy,” said Jeffrey Bokor, a UC Berkeley professor of electrical engineering and computer sciences and a faculty scientist at the Lawrence Berkeley National Laboratory. “The chips were getting so hot they’d just melt.”
Researchers have been turning to alternatives to conventional transistors, which rely on the movement of electrons to switch between 0s and 1s. Partly because of electrical resistance, it takes a fair amount of energy to ensure that the signal distinguishing the two states is clear and reliable, and that energy is shed as excess heat. Magnetic computing emerged as a promising alternative because magnetic bits are distinguished by the direction of their magnetization rather than by charge, and the two orientations are energetically equivalent: it takes no more energy for the magnet to point one way than the other, so in principle a bit can be flipped with almost no dissipation.
Bokor teamed up with UC Berkeley postdoctoral researcher Jeongmin Hong, UC Berkeley graduate student Brian Lambson and Scott Dhuey at the Berkeley Lab’s Molecular Foundry, where the nanomagnets used in the study were fabricated. They experimentally tested and confirmed the Landauer limit, named after IBM Research Lab’s Rolf Landauer, who in 1961 showed that in any computer, each logically irreversible bit operation — such as erasing a bit — must dissipate an absolute minimum amount of energy. Landauer’s discovery rests on the second law of thermodynamics: as a physical system is transformed, it tends toward greater disorder. Entropy is the measure of that disorder, and the lost order is given off as waste heat. Landauer derived a formula for the lowest energy a bit operation can cost. The result depends on the temperature of the computer; at room temperature, the limit amounts to about 3 zeptojoules, or roughly one-hundredth the energy given up by a single atom when it emits one photon of light.
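Landauer’s formula is simply E = kT ln 2 per erased bit, where k is Boltzmann’s constant and T is the absolute temperature. A quick sketch in Python (the function name is our own) reproduces the roughly 3-zeptojoule figure quoted above:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin


def landauer_limit(temperature_k: float) -> float:
    """Minimum energy (in joules) dissipated by erasing one bit
    at the given absolute temperature: E = k * T * ln(2)."""
    return K_B * temperature_k * math.log(2)


# At room temperature (~300 K) the limit comes to about 2.9e-21 J,
# i.e. roughly 3 zeptojoules, matching the figure in the article.
room_temp_limit = landauer_limit(300)
print(f"{room_temp_limit * 1e21:.2f} zJ")
```

The dependence on temperature is linear, so a computer running colder has a proportionally lower floor on its per-bit energy cost.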
The UC Berkeley team used an innovative technique to measure the tiny amount of energy dissipated when they flipped a nanomagnetic bit. The researchers used a laser probe to carefully track the direction the magnet was pointing as an external magnetic field rotated it from “up” to “down” or vice versa. They determined that it took only about 15 millielectron volts of energy – on the order of 3 zeptojoules – to flip a magnetic bit at room temperature, effectively demonstrating the Landauer limit.
This is the first time, the researchers said, that a practical memory bit has been manipulated and observed under conditions that would allow the Landauer limit to be reached. Bokor and his team published a paper in 2011 predicting that this was theoretically possible, but it had not been demonstrated until now.