(Image: phonlamaiphoto/Adobe Stock)

As we celebrate 2025, the International Year of Quantum Science and Technology, I started to think about practical applications of quantum mechanics in computers, sensors, and cryptography. And I find that thinking about these things from an engineer’s point of view is quite challenging.

If Richard Feynman, who won a Nobel Prize for his work on “quantum electrodynamics,” didn’t think anyone really understands quantum mechanics, how can so many people these days be talking about quantum computers being the next big thing? On the other hand, Feynman himself laid out the theoretical underpinnings of quantum computers in 1982, when he said that “to accurately model a quantum system, scientists would need to build another quantum system.” That other system is what we now call a quantum computer.

For example, according to an article on the Tech Briefs website, the field of quantum computing is expected to reach $65 billion by 2030. In another article, we read, “Quantum computers have the potential to solve complex problems in human health, drug discovery, and artificial intelligence millions of times faster than some of the world’s fastest supercomputers.”

To compound my confusion, I learned that “The answers from quantum computers are drawn from probability distributions. Quantum computers don't give you a specific value for an answer. What they do is tell you how likely it is for a certain value to be the correct solution” — their answers are ‘fuzzy.’ “Unfortunately, running a quantum algorithm only once isn't enough. To get as close as possible to the ‘right’ answer, computer scientists run these calculations multiple times. Each sample reduces uncertainty. The computer may need to run the algorithm thousands of times — or even more — to get as close as possible to the most accurate distribution.” But there’s an upside: “Quantum computers run these algorithms so quickly that they still have the potential to produce results much, much faster than classical ones.”
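That repeated-sampling idea can be illustrated with an ordinary classical simulation — a sketch only, not real quantum code. Here I assume a hypothetical algorithm whose output follows a fixed probability distribution over four two-bit outcomes, with the “right” answer simply being the most probable one; running more “shots” makes the modal outcome stand out more reliably.

```python
import random
from collections import Counter

# Hypothetical output distribution of a quantum algorithm (made up
# for illustration): the "right" answer, '11', is the most probable.
TRUE_DISTRIBUTION = {"00": 0.10, "01": 0.15, "10": 0.15, "11": 0.60}

def run_once(rng):
    """Simulate a single run (one 'shot'): sample one outcome."""
    r = rng.random()
    cumulative = 0.0
    for outcome, p in TRUE_DISTRIBUTION.items():
        cumulative += p
        if r < cumulative:
            return outcome
    return outcome  # guard against floating-point rounding at 1.0

def most_likely_answer(shots, seed=0):
    """Run the 'algorithm' many times and report the modal outcome."""
    rng = random.Random(seed)
    counts = Counter(run_once(rng) for _ in range(shots))
    return counts.most_common(1)[0][0]

# A handful of shots can easily pick the wrong answer; thousands of
# shots make the empirical distribution track the true one closely.
print(most_likely_answer(10))
print(most_likely_answer(10_000))
```

With only ten shots the reported answer can be any of the four outcomes; with ten thousand, it is almost certainly ‘11’ — which is exactly why quantum algorithms are run so many times.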

OK, I can accept the idea of using quantum mechanics without really deeply understanding it, but I have some real difficulty with using answers from a quantum computer that are only probabilities. As an engineer, I am used to looking for fixed, workable solutions to real-world problems, not answers that say, for example, that when I press an elevator button, the elevator will probably come to my floor.

After decades of work as an EE, SAE Media Group’s Ed Brown is well into his second career: Tech Editor.

“I realized, looking back to my engineering days and watching all of the latest and greatest as an editor, I have a lot of thoughts about what’s happening now in light of my engineering experiences, and I’d like to share some of them now.”

One of the best descriptions of quantum computing I came across was an article by National Institute of Standards and Technology (NIST) physicist Tara Fortier: “5 Concepts Can Help You Understand Quantum Mechanics and Technology — Without Math!” She explains that although “fuzziness” is an essential feature of quantum computing, it isn’t a fault. “Classical physics governs the movement of things we can see, such as baseballs and planets. Quantum physics is a world we can’t easily see. If any part of quantum is substantially different from classical physics, it is that physics at the quantum scale is not only granular but also fuzzy.”

But Dr. Fortier points out that nature itself is fuzzy. When we zoom in on a digital image, it’s made of individual pixels, which seem to have well-defined boundaries. But, “If you were able to zoom in on the atoms and subatomic particles that make up the pixel, you would see that the subatomic particles aren’t well-defined — their boundaries and behavior are somewhat unclear. This is similar to drawing a ‘perfect’ line with a pencil and ruler. If you looked at that line with a microscope, the edges would look more wobbly than straight.”

So, I guess you could say that a quantum computer sees the world in ways that are more closely aligned with how the world really is than a digital computer, which only gives us a sample of the world.

But still, as Einstein said, quantum behaviors are spooky.

Fortier’s article does make some of the quantum behaviors more approachable, but they are still very hard for me to digest. Take the one I’ve heard about since I was a kid: Light is both a wave and a particle. Sometimes it behaves like a wave, as when it gives us rainbows, but when it hits a solar panel it acts like a stream of particles. It’s hard for me to understand that, but I can put away my qualms and just accept that it can be useful in both ways.

Then there’s “the Heisenberg uncertainty principle, which says that the act of measurement disturbs the quantum state of the object.” So, how can you base a computer on the states of quantum particles if your measurement disturbs them?

But for me, the spookiest of all is quantum entanglement — the quantum state of one particle is correlated with the state of another, no matter how far apart they are. So, measuring one particle affects the state of its partner. Yet there is a practical use for entanglement — secure cryptographic keys.
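The key-generation idea can be cartooned classically — and I stress this is only a cartoon of the correlation, with none of the physics (real entanglement-based protocols such as E91 also use measurement choices to detect eavesdroppers). The sketch below assumes a hypothetical `entangled_pair` source: measuring either particle yields the same random bit, so two distant parties accumulate identical secret bits without ever transmitting them.

```python
import random

def entangled_pair(rng):
    """Cartoon of an entangled pair: measuring either particle of the
    pair yields the same random bit (perfectly correlated outcomes)."""
    bit = rng.randint(0, 1)
    return bit, bit  # (Alice's measurement, Bob's measurement)

def shared_key(length, seed=42):
    """Alice and Bob each measure their half of many pairs; the
    correlated results become a shared secret key."""
    rng = random.Random(seed)
    alice, bob = [], []
    for _ in range(length):
        a, b = entangled_pair(rng)
        alice.append(a)
        bob.append(b)
    return alice, bob

alice_key, bob_key = shared_key(8)
assert alice_key == bob_key  # both ends now hold the same secret bits
```

The point of the sketch is only that correlation, not communication, is what the two parties exploit — the bits themselves never travel between them.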

All of this makes me think that more than science is needed to develop practical quantum computers — engineers will have to accept working with technology that they can’t really understand.