An autonomous HVAC system being developed at the University of Michigan takes the thermostat off the wall and finds personalized thermal settings for individuals in an indoor environment. The Human Embodied Autonomous Thermostat, or “HEAT,” uses cameras to identify facial temperature and adjust the room's air accordingly.

The new take on flexible climate control could have valuable applications in the home, office, and hospitals, especially as airborne illnesses like COVID-19 change how individuals interact indoors.

“COVID presents a variety of new climate control challenges, as buildings are occupied less consistently and people struggle to stay comfortable while wearing masks and other protective gear,” said project principal investigator and study co-author Carol Menassa, associate professor of civil and environmental engineering.

Prof. Carol Menassa

The “HEAT” system, featured in the July 2020 issue of Building and Environment, pairs thermal cameras with three-dimensional video cameras to track occupants' facial temperatures and determine if they are hot or cold.

The cameras, distributed around a given room, recognize individuals' faces, pixelate the different facial regions, and return the temperature of each pixel. Since only facial temperature is needed to initiate air-circulation decisions, individuals do not need to wear detection devices, such as wristbands, to measure full-body temperature and provide the data.
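The per-pixel readout described above can be reduced to a temperature per facial region before any comfort inference. The sketch below is a minimal illustration of that step, not the team's actual pipeline: it assumes face regions have already been located by a detector and are given as bounding boxes over a 2-D array of per-pixel temperatures.

```python
import numpy as np

def region_temperatures(thermal_frame, regions):
    """Average the per-pixel temperatures inside each labeled facial region.

    thermal_frame: 2-D array of temperatures (deg C), one value per pixel.
    regions: mapping of region name -> (row_slice, col_slice) bounding box,
             assumed to come from an upstream face detector (hypothetical here).
    """
    return {name: float(thermal_frame[rows, cols].mean())
            for name, (rows, cols) in regions.items()}

# Toy 4x4 "thermal image": warmer cheek rows (34-35 C) than forehead rows (33 C).
frame = np.array([
    [33.0, 33.0, 33.0, 33.0],
    [33.0, 33.0, 33.0, 33.0],
    [35.0, 34.0, 34.0, 35.0],
    [35.0, 34.0, 34.0, 35.0],
])
regions = {
    "forehead": (slice(0, 2), slice(0, 4)),
    "cheeks":   (slice(2, 4), slice(0, 4)),
}
temps = region_temperatures(frame, regions)
# temps -> {"forehead": 33.0, "cheeks": 34.5}
```

Averaging within regions (rather than using raw pixels) is one simple way to tolerate sensor noise; the actual system's feature extraction is not described in this article.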

Facial temperature, it turns out, is a good predictor of comfort, says Menassa. When we’re too hot, our blood vessels expand to radiate additional heat, raising facial temperature; when we’re too cold, our vessels constrict.

The HEAT system feeds the room's facial temperature data to a predictive model, which compares the data with information about occupants' thermal preferences and determines the temperature that will keep the largest number of occupants comfortable with minimum energy expenditure. HEAT's predictive model was built by U-M industrial and operations engineering associate professor Eunshin Byon, who is also an author on the study.
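The trade-off the model makes, balancing group comfort against energy use, can be sketched as a small cost-minimization. This is an illustrative stand-in, not Prof. Byon's model: it assumes each occupant has a single preferred temperature, scores candidate setpoints by total deviation from those preferences, and adds a weighted penalty for moving far from the current setpoint as a crude energy proxy.

```python
def choose_setpoint(preferred_temps, candidates, current, energy_weight=0.1):
    """Pick the candidate setpoint minimizing total occupant discomfort
    plus a penalty for large moves from the current setpoint (energy proxy).

    preferred_temps: one preferred temperature (deg C) per occupant.
    candidates: setpoints the HVAC system could be asked to hold.
    """
    def cost(t):
        discomfort = sum(abs(t - p) for p in preferred_temps)
        return discomfort + energy_weight * abs(t - current)
    return min(candidates, key=cost)

# Three occupants preferring 21.0, 22.5, and 23.0 C; room currently at 20.0 C.
best = choose_setpoint([21.0, 22.5, 23.0],
                       candidates=[20.0, 21.0, 22.0, 23.0],
                       current=20.0)
# best -> 22.0
```

With the energy penalty, 22.0 C edges out 23.0 C even though both leave similar total discomfort; raising `energy_weight` would bias the choice further toward the current temperature.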

In a way, HEAT works like an Internet-enabled "smart" thermostat. Occupants teach the autonomous HVAC tech by periodically giving the system feedback from their smartphones on a three-point scale: “too hot,” “too cold,” or “comfortable.” After a few days, HEAT learns their preferences and operates independently.
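One minimal way to turn that three-point feedback into a usable preference, offered here as an assumption rather than the system's actual learning method, is to bracket a personal comfort band: the warmest temperature a person called "too cold" becomes a lower bound, and the coolest temperature they called "too hot" an upper bound.

```python
def comfort_band(feedback):
    """Estimate a personal comfort band from (temperature, vote) pairs.

    feedback: list of (deg C, vote) tuples, where vote is one of
              "too hot", "too cold", or "comfortable".
    Returns (low, high); either bound is None if no such vote exists yet.
    """
    too_cold = [t for t, v in feedback if v == "too cold"]
    too_hot = [t for t, v in feedback if v == "too hot"]
    low = max(too_cold) if too_cold else None
    high = min(too_hot) if too_hot else None
    return low, high

# A few days of smartphone votes from one occupant.
band = comfort_band([(19.0, "too cold"), (21.5, "comfortable"),
                     (25.0, "too hot"), (20.0, "too cold")])
# band -> (20.0, 25.0)
```

Once the band stabilizes, feedback can taper off, matching the article's description of HEAT operating independently after a few days.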

The University of Michigan research team is working with the research and development arm of power company Southern Company to begin testing HEAT in its Alabama offices, where cameras will be mounted on tripods in the corners of rooms. The cameras collect temperature data anonymously, and all footage is deleted seconds after processing.

Menassa and the team are also studying the model's effectiveness beyond homes and offices — in hospitals, for example, where care providers struggle to stay comfortable under masks and other protective equipment.

In partnership with the U-M School of Nursing, Menassa's research group has already conducted a pilot study exploring how the system can provide personalized thermal comfort for nurses working in healthcare environments such as chemotherapy administration units.

In an edited Q&A with Tech Briefs below, Prof. Menassa explains where we're most likely to see the autonomous system.

Tech Briefs: What does the implementation of the system look like? Can you take us into an application scenario that you envision?

Prof. Carol Menassa: The system can be used in single- or multi-occupancy spaces. A network of thermal cameras is distributed in the environment. These thermal cameras capture human faces and detect their temperature from different distances and angles. We then use the identified features in the images, along with our machine learning algorithm, to determine individual and collective comfort profiles. This allows us to determine the optimal setpoint temperature for the space that minimizes discomfort among the occupants. This information can then be conveyed directly to the HVAC system whenever a heating or cooling adjustment to the space's setpoint temperature is needed.

Tech Briefs: What were the most surprising things that you learned from your studies, when this system was put into use?

Prof. Carol Menassa: The most surprising aspect for us was discovering that thermal comfort is highly personalized and depends on specific facial features for different people. For example, cheek temperature might be highly indicative of thermal comfort for some people but not for others.

Tech Briefs: What are the biggest challenges when you’re trying to get temperature right for, say, a group of ten people?

Prof. Carol Menassa: The most challenging aspect is first identifying and associating measurements with each individual in the space, and continuing to do so even as they move from one location in the room to another. Because we adopted low-cost thermal cameras rather than expensive ones, the cameras need to be placed close to occupants (within about 2 meters) to achieve acceptable accuracy. Occlusions can also occur in a complex environment where an individual is blocked by other people or by furniture.
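The association problem Prof. Menassa describes, keeping each temperature reading attached to the right moving occupant, is often handled with frame-to-frame track matching. The sketch below is a generic greedy nearest-neighbor matcher under assumed pixel-coordinate inputs, not the team's tracking method.

```python
import math

def associate(tracks, detections, max_dist=50.0):
    """Greedily match new face detections to existing per-person tracks
    by image distance, so readings follow the right occupant between frames.

    tracks: {person_id: (x, y)} last known face position in pixels.
    detections: list of (x, y) face positions in the current frame.
    Returns {person_id: detection_index}; detections left unmatched
    (e.g., a newly arrived occupant) would start new tracks.
    """
    assignments = {}
    free = set(range(len(detections)))
    for pid, (tx, ty) in tracks.items():
        best, best_d = None, max_dist
        for i in free:
            d = math.hypot(detections[i][0] - tx, detections[i][1] - ty)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            assignments[pid] = best
            free.discard(best)
    return assignments

# Two tracked occupants; the new frame lists their faces in swapped order.
matches = associate({"A": (100, 100), "B": (300, 120)},
                    [(310, 118), (95, 104)])
# matches -> {"A": 1, "B": 0}
```

The `max_dist` gate is what makes occlusions visible: a person hidden behind furniture simply gets no match that frame, leaving their track to be re-acquired later.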

The other important challenge we addressed is being able to use data collected from different people at different angles and distances from the camera network and still be able to use our algorithm to accurately predict a person’s thermal profile.

The HEAT Prototype (Image Credit: UMich)

Tech Briefs: When and how did this idea come about?

Prof. Carol Menassa: We started working on using data (e.g., heart rate, skin temperature) obtained from wearable devices back in 2015 and subsequently developed a phone application that is capable of curating all this data as well as human feedback to determine a person’s comfort zone. Through our testing of this system, we realized that the greatest hurdle to its widespread application is the fact that occupants are still required to provide feedback for the system to adjust the thermal setting of the space. With time people get busy, and continuously providing feedback becomes a distraction.

Tech Briefs: And how did work on this particular system begin?

Prof. Carol Menassa: In 2017 we decided to start exploring non-intrusive methods to predict a person’s thermal comfort. At that time, a recent trip through an airport that used thermal cameras to detect if visitors to a country were sick sparked my interest in exploring that approach. A grant funded through the US National Science Foundation allowed us to explore this idea and develop our technology.

Tech Briefs: How does COVID-19 give your idea greater importance?

Prof. Carol Menassa: COVID-19 is dramatically changing the way we use indoor spaces. With the recent mandated lockdowns, people are forced to spend more time indoors and to work in spaces they have not used before. Maintaining a healthy and comfortable indoor environment in homes and office spaces is critical to ensuring people can perform their tasks successfully. Our research indicates that thermal comfort plays an important role in how an individual's performance and mental workload change with indoor temperature while performing the same task. Thus, being able to autonomously and non-intrusively control the environment to provide ideal thermal comfort for individuals can enhance their productivity and wellbeing.

HEAT is available as a licensable technology through the U-M Office of Technology Transfer.
