It’s hard to get through the day without hearing or reading about AI. Most of it is about the great things it will do — or the terrible things it will do — but there is a consensus that it will change our lives. Once you start digging into the details, you learn that it relies on huge amounts of data, which are stored in giant data centers, otherwise known as the “cloud.” That works fine for generative AI, or to sort through tons of data to extract meaningful information. But, cloud data is not as useful for real-time systems. So, it’s helpful to think of AI being used in two major application areas: one is data analytics, and the other is manipulation of data for real-time automated systems. AI in real-time systems needs real-time data, and the source of real-time data is sensors. As I wrote in an earlier blog, “Without sensors there would be no IoT, or IIoT, or Industry 4.0.”
As editor of Sensing Technology magazine, I keep up with the latest developments in sensors. The typical news I see falls into a few different slots. A big one these days is “smart sensors,” which can do some onboard data manipulation. In my day, a sensor output would usually be 0 – 10 V or 4 – 20 mA. So, to use the sensor’s output for display and/or control, I would feed the signal into a PLC and write some code to convert, say, 4 – 20 mA to 0 – 100 °F. Or, I could buy an instrument that would do it for me. With smart sensors, all of that is included in the sensor package. But, as with everything else, AI is entering the picture for sensors, in the form of edge processing. Adding the ability to run AI data analytics in the sensor itself lets the sensor determine what information is important to send, so it doesn’t waste power transmitting useless data. Doing more preprocessing at the edge also means fewer bits need to be sent to the main controller or to the cloud, reducing latency and speeding up response for real-time processes.
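Both halves of that story come down to a few lines of code: the linear scaling I used to write into a PLC, and the kind of report-by-exception filtering a smart edge sensor might apply before transmitting. Here is a minimal sketch in Python — the function names, the deadband threshold, and the range check are my own illustrative assumptions, not any vendor’s API:

```python
def ma_to_degf(current_ma, lo_ma=4.0, hi_ma=20.0, lo_f=0.0, hi_f=100.0):
    """Linearly scale a 4-20 mA loop current to a 0-100 degF reading."""
    if not (lo_ma <= current_ma <= hi_ma):
        # A current below 4 mA usually means a broken loop or sensor fault.
        raise ValueError("loop current out of range -- possible wiring fault")
    return lo_f + (current_ma - lo_ma) * (hi_f - lo_f) / (hi_ma - lo_ma)


def report_by_exception(readings, deadband_f=1.0):
    """Edge-style filter: forward a reading only when it has moved more
    than the deadband since the last value sent, cutting useless traffic."""
    sent = []
    last = None
    for r in readings:
        if last is None or abs(r - last) > deadband_f:
            sent.append(r)
            last = r
    return sent
```

So a mid-scale 12 mA signal maps to 50 °F, and a slowly drifting temperature that stays within the deadband generates no transmissions at all — which is exactly the power and bandwidth saving the edge-processing argument is about.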
After decades of work as an EE, SAE Media Group’s Ed Brown is well into his second career: Tech Editor.
“I realized, looking back to my engineering days and watching all of the latest and greatest as an editor, I have a lot of thoughts about what’s happening now in light of my engineering experiences, and I’d like to share some of them now.”
Although these developments have made sensors much more sophisticated, the designer still has to figure out how to adapt a standard sensor for their particular system. So, a press release titled “Sensors, Centered” caught my attention. It describes a different paradigm: instead of adapting the sensor to the system, sensor designers and sensor users should collaborate right from the start to develop the sensor to meet the needs of the user.
A new institute at North Carolina State University, the Institute for Connected Sensor-Systems (IConS), looks to do just that. Describing its mission, Alper Bozkurt, one of the institute’s advisors, said, “We try to come up with innovations, starting from the problem and then finding the solution, rather than having a solution and looking for a problem to solve … This is, I think, the main spirit of IConS, that it is a conversation.” I believe that this kind of collaboration leads to better, more efficient, more effective results.
For example, one of the projects supported by the institute is to determine the best types of sensors for early detection of mild cognitive impairment (MCI), so aggressive steps can be taken to slow its progress. Working with a group of people experiencing MCI and a control group, the researchers tested a combination of facial expression sensors, audio sensors, and physiological sensors such as heart rate and blood pressure. They then did cognitive testing while the sensors were attached — comparing results when the subjects were at rest and when they were exercising — in hopes of finding a particularly sensitive biomarker.
In another project, researchers in both sensors and in materials are working along with plant biologists. The biologists need sensors they can apply to plant surfaces without interfering with the plant’s natural functioning. They jointly developed ultrathin transparent electrodes to create the ideal sensor for this application.
In another collaborative project, the emphasis is on combining expertise in materials, sensors, and manufacturing processes to develop next-generation single-walled carbon nanotube optical biosensors. The sensor developers worked with the materials experts to design a high-resolution optical sensor with precise geometry. These sensors can be designed to respond sensitively to particular molecules and environmental conditions when implanted within complex biological spaces. Meanwhile, 3D-printing experts developed special inks to manufacture the sensors.
My takeaway is that collaboration among experts in different fields from the outset of a project essentially multiplies the talents of each. It is so much more effective than coming up with a design and then looking around for what’s available to implement it.

