By detecting nearly imperceptible changes in skin color, emerging imaging technologies can extract pulse rate, breathing rate, and other vital signs from a person facing a camera. These video-based tools have struggled, however, to compensate for low-light conditions, dark skin tones, and movement.

Rice University’s Ashok Veeraraghavan, an assistant professor of electrical and computer engineering, sits in front of a webcam to have his pulse and breathing analyzed. (Image Credit: Jeff Fitlow/Rice University)

Rice University graduate student Mayank Kumar is currently leading a project to refine the video monitoring of vital signs. Kumar and his team have developed the DistancePPG algorithm to average skin-color change signals and track a subject’s entire face, including his or her nose, eyes, and mouth.

The DistancePPG software could ultimately end up on smartphones, allowing users to assess their health at any time.

Imaging Technology: How did the DistancePPG project come about?

Mayank Kumar: We visited Texas Children’s Hospital some time back and found multiple probes and patches that are used in neonatal wards to continuously monitor babies’ vital signs. These patches and probes damage the delicate skin of premature babies. That motivated us to think about developing new techniques for monitoring vital signs without touching the babies.

Rice University graduate student Mayank Kumar (Image Credit: Jeff Fitlow/ Rice University)

DistancePPG uses a camera — an iPhone camera or even an ordinary webcam — to record video of the person facing it. From that video we can extract vital signs: the pulse rate as well as the breathing rate.

We started by testing a few previous methods of using a camera to record vital signs. During testing, we found that these methods do not work for people with darker skin tones. They also fail under low lighting conditions, such as dim ambient light, or when the person moves in front of the camera. Our algorithm solves these challenges, and it opens up avenues for many new applications.

Imaging Technology: How does the camera-based vital sign monitoring work?

Kumar: In camera-based methods, we look at the slight changes in skin color when the heart pumps the blood. At each heartbeat, there is more blood in the face; the face starts to get slightly red. We cannot see these small color changes with our naked eye, but a normal camera is able to capture that, albeit with some difficulties, as the skin-color change due to blood flow is really small.

The basic challenge was: How do we extract this small signal from a sea of “noise” reliably? Darker skin tones as well as low lighting conditions made the SNR (signal-to-noise ratio) even worse. We devised an algorithm to improve the signal-to-noise ratio.
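The underlying idea can be sketched in a few lines of code. This is a simplified illustration, not the DistancePPG implementation: it simulates a per-frame mean green-channel intensity for a face region (the tiny pulse ripple buried in noise that Kumar describes) and recovers the pulse rate by finding the dominant frequency in the physiological band. All signal values and parameters here are invented for the demo.

```python
import numpy as np

# Hypothetical sketch: estimate pulse rate from the mean green-channel
# intensity of a face region over time. The PPG signal is a tiny
# periodic color change riding on a much larger baseline.
fps = 30.0                     # assumed camera frame rate (frames/s)
t = np.arange(0, 20, 1 / fps)  # 20 s of "video"
true_hr_hz = 1.2               # simulated pulse: 72 beats per minute

# Simulated per-frame mean green value: large DC level, a small
# pulse-synchronous ripple, and sensor noise that nearly drowns it out.
rng = np.random.default_rng(0)
signal = 120.0 + 0.3 * np.sin(2 * np.pi * true_hr_hz * t)
signal += rng.normal(0, 0.5, t.size)

# Remove the DC baseline, then find the dominant frequency inside a
# plausible pulse band (0.7-4 Hz, i.e. roughly 42-240 bpm).
centered = signal - signal.mean()
spectrum = np.abs(np.fft.rfft(centered))
freqs = np.fft.rfftfreq(centered.size, d=1 / fps)
band = (freqs >= 0.7) & (freqs <= 4.0)
pulse_hz = freqs[band][np.argmax(spectrum[band])]
print(round(pulse_hz * 60))    # estimated pulse rate, beats per minute
```

Even in this toy version, a 0.3-unit ripple is recovered from noise nearly twice its size because the pulse energy concentrates in a single frequency bin while the noise spreads across the whole spectrum — the same reason SNR, not raw signal size, is the quantity that matters.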

Imaging Technology: What are the key innovations of this software/ algorithm?

Kumar: There are three key innovations: First, the software uses a novel method to identify which regions in the face are better for estimating vital signs. As depth and density of arteries underneath the skin surface vary, the signal strength of skin-color change varies in different regions of the face. Our algorithm provides a “goodness” score for each facial region by directly analyzing the recorded video of the face, thus providing a way to reject not-so-good regions.
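A region "goodness" score of the kind Kumar describes can be illustrated as a simple spectral SNR: power near the pulse frequency divided by power elsewhere in the physiological band. The function name, band limits, and the two simulated regions below are assumptions for illustration; DistancePPG's actual metric may differ in detail.

```python
import numpy as np

def goodness_score(region_signal, fps, pulse_hz, half_width=0.1):
    """Illustrative SNR-style score for one region's mean-intensity trace:
    spectral power within half_width Hz of the pulse frequency, divided
    by the remaining power in a plausible pulse band (0.5-5 Hz)."""
    x = region_signal - np.mean(region_signal)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1 / fps)
    band = (freqs >= 0.5) & (freqs <= 5.0)
    near = band & (np.abs(freqs - pulse_hz) <= half_width)
    return power[near].sum() / power[band & ~near].sum()

# Two simulated regions with the same noise but different PPG strength,
# standing in for well- and poorly-perfused patches of the face.
fps = 30.0
t = np.arange(0, 20, 1 / fps)
rng = np.random.default_rng(1)
strong_region = 0.4 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.3, t.size)
weak_region = 0.05 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.3, t.size)

print(goodness_score(strong_region, fps, 1.2) >
      goodness_score(weak_region, fps, 1.2))
# the strongly perfused region scores higher, so weak regions
# can be rejected or down-weighted
```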

From left, researchers developing the DistancePPG software include Ashutosh Sabharwal, Mayank Kumar, and Ashok Veeraraghavan. (Image Credit: Jeff Fitlow/Rice University)

Second, the new method combines the small skin-color change signals obtained from the different regions, using a weighted-averaging algorithm to maximize the signal-to-noise ratio of the estimated signal and thereby improve the accuracy of vital sign estimation.
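The benefit of weighting by signal quality can be shown with simulated region signals. This sketch assumes the pulse frequency is known and uses weights proportional to each region's estimated band SNR; the exact weighting rule in DistancePPG is not reproduced here.

```python
import numpy as np

fps = 30.0
t = np.arange(0, 20, 1 / fps)
rng = np.random.default_rng(2)
pulse = np.sin(2 * np.pi * 1.2 * t)   # common underlying pulse waveform

# Three simulated face regions: similar noise, very different PPG strength.
amps = np.array([0.5, 0.3, 0.02])
regions = np.stack([a * pulse + rng.normal(0, 0.4, t.size) for a in amps])

def band_snr(x, fps, f0=1.2, hw=0.1):
    """Power near the (assumed known) pulse frequency f0 over power
    elsewhere in the 0.5-5 Hz band — a rough per-signal quality score."""
    p = np.abs(np.fft.rfft(x - x.mean())) ** 2
    f = np.fft.rfftfreq(x.size, d=1 / fps)
    near = np.abs(f - f0) <= hw
    return p[near].sum() / p[~near & (f > 0.5) & (f < 5.0)].sum()

# Weight each region by its estimated SNR, then average.
weights = np.array([band_snr(r, fps) for r in regions])
weights /= weights.sum()
combined = weights @ regions            # SNR-weighted average
naive = regions.mean(axis=0)            # plain unweighted average

print(band_snr(combined, fps) > band_snr(naive, fps))
```

The weighted average beats the plain average because the nearly useless third region is down-weighted instead of contributing its noise equally — the same intuition behind rejecting "not-so-good" regions.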

Finally, the software uses an improved method for tracking the face during naturalistic motion to compensate for motion-related artifacts.

Imaging Technology: What are the more intelligent ways that DistancePPG is handling motion?

Kumar: For improving vital sign estimation accuracy under motion, one needs to work on two aspects: (i) tracking the person’s facial movement in front of the camera, and (ii) compensating for the change in skin-surface reflection during motion due to the changing angle of the skin relative to the camera. We tackle the first challenge by using a deformable face model to divide the face into multiple regions, which are tracked separately. For the second, we use time- and frequency-domain filters to separate the small skin-color change signals from the large surface-reflection changes caused by motion.
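The frequency-filtering idea in point (ii) can be illustrated with a toy band-pass filter: motion-induced reflection changes are large but slow, while the pulse occupies a narrow physiological band. A simple FFT masking filter stands in here for whatever filters DistancePPG actually uses; the signals and band edges are assumptions for the demo.

```python
import numpy as np

def bandpass(x, fps, lo=0.7, hi=4.0):
    """Crude band-pass: zero all spectral content outside [lo, hi] Hz."""
    spec = np.fft.rfft(x - x.mean())
    freqs = np.fft.rfftfreq(x.size, d=1 / fps)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=x.size)

fps = 30.0
t = np.arange(0, 20, 1 / fps)
pulse = 0.3 * np.sin(2 * np.pi * 1.2 * t)    # small pulse ripple (1.2 Hz)
motion = 5.0 * np.sin(2 * np.pi * 0.2 * t)   # large, slow head sway (0.2 Hz)
observed = 120.0 + motion + pulse            # what the camera "sees"

recovered = bandpass(observed, fps)
err = np.sqrt(np.mean((recovered - pulse) ** 2))
print(err < 0.05)  # the tiny pulse survives; the big sway is removed
```

In reality motion artifacts are broadband rather than a clean sinusoid, which is why the actual system also needs the deformable face tracking from point (i) rather than filtering alone.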

Imaging Technology: Why do you think it’s necessary to have this non-contact method?

Kumar: It all depends on the context. One is hospital settings, where newborn and premature babies must be continuously monitored for vital signs. Pulse oximeters and wired probes are put on premature babies. Many times, [the probes] have to be taken off and put back on, and the skin is very delicate. If we can replace those instruments with a camera that constantly monitors the baby and reliably estimates vital signs, then we can get rid of the wires. You can also think of burn patients, who may also have probes placed on them.

Imaging Technology: How do you envision the technology being used with computers, tablets, and other software?

Kumar: As we are making our system robust against natural motion, it will become increasingly feasible to run our algorithm in the background in tablets and laptops to monitor vital signs like pulse rate and breathing rate. Continuous background monitoring of vital signs will open up new ways to provide better care, particularly for patients having heart problems.

Doctors can monitor their patients’ vital signs through video by analyzing subtle changes in skin color. The new software improves the technique by keying on regions of the face to help compensate for different skin tones, changes in lighting, and movement. (Image Credit: Mayank Kumar/Rice University)

Because of our higher signal-to-noise ratio, we can determine how the beat-to-beat pulse interval (also known as pulse rate variability) changes over time. Pulse rate variability is a critical parameter for heart-related ailments. You can monitor your beat-to-beat timing and continuously send that data to your doctor.
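Once individual beats can be located in the recovered waveform, pulse rate variability follows from their timing. The beat times below are invented, and summarizing variability as the standard deviation of intervals (analogous to the SDNN statistic used for heart rate variability) is one common choice among several, not necessarily the one DistancePPG reports.

```python
import numpy as np

# Hypothetical beat times (seconds) detected in the pulse waveform.
beat_times = np.array([0.00, 0.83, 1.70, 2.49, 3.35, 4.18, 5.02])

intervals = np.diff(beat_times)       # beat-to-beat intervals (s)
mean_rate = 60.0 / intervals.mean()   # average pulse rate, beats/min
prv_ms = 1000.0 * intervals.std()     # variability, milliseconds (SDNN-like)

print(round(mean_rate, 1), round(prv_ms, 1))  # average rate and variability
```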

Imaging Technology: Do you imagine this software being used in a mainstream way on mobile devices? What’s next for DistancePPG?

Kumar: I imagine it more to be used as a patient well-being/wellness app in smartphones and PCs. We are working with partners to see it through. We have developed a PC/Mac application which can continuously measure vital signs, and we are currently testing it in realistic scenarios. We are also planning to develop apps for tablets and mobile devices in the near future.

We are currently working to further improve performance under motion scenarios. Based on our current understanding, we need to better model the change in skin surface reflection during motion so that we can filter out motion-related corruptions and reliably estimate pulse-related skin-color change signals, even under large motion.

For more information about Rice University’s DistancePPG software, visit http://sh.rice.edu/camera_vitals.html.