Telehealth has become a critical way for doctors to continue providing health care while minimizing in-person contact during COVID-19. But with phone or online appointments, it’s harder for doctors to obtain important vital signs, such as a patient’s pulse or respiration rate, in real time.

A new method uses the camera on a person’s smartphone or computer to read their pulse and respiration signals from a real-time video of their face. For machine learning to be helpful in remote health sensing, the system must identify the region of the video that holds the strongest source of physiological information, such as pulse, and then measure it over time. Since every person is different, the system must also adapt quickly to each person’s unique physiological signature and separate it from other sources of variation, such as appearance and environment.
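The article doesn’t detail the extraction step, but a minimal, classical sketch of the region-of-interest idea might look like the following, with an off-the-shelf face detector standing in for the team’s learned model. The helper name `face_signal` and the choice of a Haar cascade are illustrative assumptions, not the team’s method.

```python
import cv2
import numpy as np

def face_signal(video_path: str) -> np.ndarray:
    """Return one sample per frame: mean green intensity of the face ROI.

    A classical stand-in for the learned region-of-interest step: find the
    face in each frame and track the average green-channel intensity inside
    it over time (green carries the strongest blood-volume signal in RGB).
    """
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    samples = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue  # skip frames where no face is detected
        x, y, w, h = faces[0]
        roi = frame[y:y + h, x:x + w]
        samples.append(roi[:, :, 1].mean())  # channel 1 = green in BGR
    cap.release()
    return np.asarray(samples)
```

Averaging over the whole detected face is crude; a learned system can instead weight the regions of each frame that carry the most physiological signal.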

The team’s system is privacy preserving (it runs on the device instead of in the cloud) and uses machine learning to capture subtle changes in how light reflects off a person’s face, changes that are correlated with blood flow. It then converts those changes into both pulse and respiration rates.
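Turning such a per-frame trace into a heart rate classically means band-passing the signal around plausible pulse frequencies and taking the dominant spectral peak. A minimal sketch, assuming the `face_signal` trace above and SciPy for filtering:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def pulse_rate_bpm(signal: np.ndarray, fps: float) -> float:
    """Estimate pulse from a per-frame intensity trace via an FFT peak."""
    sig = signal - signal.mean()
    # Keep only plausible heart-rate frequencies (~0.7-4.0 Hz, 42-240 bpm).
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    sig = filtfilt(b, a, sig)
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    power = np.abs(np.fft.rfft(sig)) ** 2
    return freqs[np.argmax(power)] * 60.0  # dominant frequency, in bpm
```

Respiration can be read out the same way with a lower pass-band, roughly 0.1 to 0.5 Hz (6 to 30 breaths per minute).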

The first version of this system was trained with a dataset that contained both videos of people’s faces and “ground truth” information: each person’s pulse and respiration rate measured by standard instruments in the field. The system then used spatial and temporal information from the videos to calculate both vital signs. It outperformed similar machine learning systems on videos where subjects were moving and talking. But while the system worked well on some datasets, it still struggled with others that contained different people, backgrounds, and lighting — a common problem known as “overfitting.”
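The article doesn’t name the architecture, but a toy sketch of supervised training on spatial-plus-temporal input could look like this: short clips pass through 3-D convolutions that mix space and time, and the output waveform is regressed against the instrument-measured ground truth. All names and shapes here are assumptions for illustration, not the team’s model.

```python
import torch
import torch.nn as nn

class TinySpatioTemporalNet(nn.Module):
    """Toy stand-in: 3-D convolutions mix spatial and temporal information."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((None, 1, 1)),  # collapse space, keep time
        )
        self.head = nn.Conv1d(32, 1, kernel_size=1)

    def forward(self, clip):            # clip: (batch, 3, T, 36, 36)
        x = self.features(clip)         # -> (batch, 32, T, 1, 1)
        x = x.flatten(2)                # -> (batch, 32, T)
        return self.head(x).squeeze(1)  # -> (batch, T) waveform estimate

def train_step(model, optimizer, clip, waveform):
    """One supervised step against the ground-truth pulse waveform."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(clip), waveform)
    loss.backward()
    optimizer.step()
    return loss.item()
```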

The researchers improved the system by having it produce a personalized machine learning model for each individual. Specifically, the system learns to find the areas of a video frame that are most likely to contain physiological features correlated with changing blood flow in a face, across contexts such as different skin tones, lighting conditions, and environments. It can then focus on those areas when measuring pulse and respiration rate.
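One simple way to realize a per-person model, sketched below, is to fine-tune a copy of the shared network on a few labeled seconds of video from that individual. The team’s actual adaptation scheme is not described here, so treat `personalize` and its parameters as hypothetical.

```python
import copy
import torch

def personalize(shared_model, calib_clips, calib_waveforms,
                steps=10, lr=1e-4):
    """Fine-tune a copy of the shared model on one person's calibration
    clips, returning a personalized model. Illustrative only; the team's
    exact adaptation method may differ."""
    model = copy.deepcopy(shared_model)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        for clip, waveform in zip(calib_clips, calib_waveforms):
            optimizer.zero_grad()
            loss = torch.nn.functional.mse_loss(model(clip), waveform)
            loss.backward()
            optimizer.step()
    return model
```

Because only the copy is updated, the shared model stays intact for the next user.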

While this new system outperforms its predecessor on more challenging datasets, especially for people with darker skin tones, there is more work to do. Performance still tends to degrade as the subject’s skin tone gets darker, in part because light reflects differently off darker skin, leaving a weaker signal for the camera to pick up.

Any ability to sense pulse or respiration rate remotely provides new opportunities for remote patient care and telemedicine. This could include self-care, follow-up care, or triage, especially when someone doesn’t have convenient access to a clinic.

For more information, contact Sarah McQuate at 206-543-2580.