Most methods for measuring vital signs in medical settings require some form of physical contact with the patient – a blood pressure cuff around the arm, for instance, or an electrocardiogram (EKG) probe on the chest.

But for vulnerable premature babies in a neonatal intensive care unit (NICU), these devices can be dangerous. Their sensitive skin would be damaged by the continuous attachment and removal of monitors, opening the door to infections.

A team of researchers from Rice University is developing a way to track vital signs non-invasively. In a paper published in The Optical Society (OSA) journal Biomedical Optics Express, they describe refinements to an emerging technique that uses a video camera to monitor blood volume.

As the circulatory system pumps blood throughout the body, the minuscule change in blood volume with each pulse results in a corresponding change in skin color. Though undetectable with the naked eye, these subtle variations can be tracked with a video camera, allowing the researchers to extract information about blood volume – and ultimately vital signs.

The idea of using a camera to track vital signs is based on photoplethysmography (PPG), a way to measure physiological processes under the skin by monitoring subtle changes at the skin’s surface. Camera-based PPG has been previously studied, but its applications are somewhat limited: It works well on fair-skinned individuals sitting perfectly still in a well-lit room. However, if the patient has dark skin, is moving slightly, or is in poorly-lit surroundings, the technique may not work.

The Rice University research team created a new algorithm to address these challenges, potentially expanding the use of camera-based PPG to a wide range of clinical conditions. Instead of computing the color change from the whole face as a single region, their algorithm combines measurements from several regions of the face using a weighted average.

“Our key finding was that the strength of skin-color change signal is different in various regions of the face,” said Mayank Kumar, a graduate student at Rice and co-author of the new study. These variations arise because the ambient light is reflected differently, he explained. “Thus we devised an algorithm which combines the color-change signal obtained from different regions of the face using a weighted average. This improved the accuracy of derived vital signs, rapidly expanding the scope, viability, reach and utility of camera-based vital sign monitoring.”
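The weighting idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual DistancePPG implementation: the function names (`estimate_snr`, `combine_regions`) are hypothetical, and a simple in-band spectral power ratio stands in for the paper's region-quality metric.

```python
import numpy as np

def estimate_snr(signal, fs, band=(0.5, 5.0)):
    """Crude quality proxy for one region's color-change signal:
    the fraction of spectral power inside the physiological pulse
    band (0.5-5 Hz, roughly 30-300 beats per minute)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].sum() / (power.sum() + 1e-12)

def combine_regions(region_signals, fs):
    """Weighted average of per-region color-change signals, giving
    more weight to regions whose signal quality is higher."""
    weights = np.array([estimate_snr(s, fs) for s in region_signals])
    weights = weights / weights.sum()
    return np.average(np.vstack(region_signals), axis=0, weights=weights)
```

A clean, periodic region signal receives a larger weight than a noisy one, so the combined signal is dominated by the regions where the pulse is most visible.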

Motion presents an even greater challenge. Even small facial movements can dramatically change the way different parts of the face reflect light. Those changes can dwarf the tiny color change signals that indicate a pulse, making it nearly impossible to distinguish the signal from the noise.

To mitigate this problem, the researchers implemented a face-tracking algorithm that identified the location of key features (eyes, nose, and mouth) in order to accurately track the facial regions as they changed positions across video frames.
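The benefit of sampling a region that follows tracked landmarks, rather than a fixed image location, can be shown with a toy sketch. This is an assumption-laden illustration, not the team's tracker: `region_mean_series` is a hypothetical helper, and the per-frame landmark positions are assumed to come from some face-tracking step.

```python
import numpy as np

def region_mean_series(frames, landmarks, offset, size):
    """Mean intensity of a small square patch that follows a tracked
    landmark (e.g. the nose tip) across video frames.

    frames:    sequence of (H, W) grayscale frames
    landmarks: per-frame (row, col) landmark positions from a tracker
    offset:    (row, col) offset of the patch relative to the landmark
    size:      patch side length in pixels
    """
    series = []
    for frame, (r, c) in zip(frames, landmarks):
        r0, c0 = int(r + offset[0]), int(c + offset[1])
        patch = frame[r0:r0 + size, c0:c0 + size]
        series.append(patch.mean())
    return np.array(series)
```

Because the patch moves with the landmark, a region of skin that shifts between frames still contributes a stable intensity series, whereas sampling a fixed pixel window would mix in whatever drifts underneath it.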

“Interestingly, this technique has been known in other domains of computer vision, but has not been properly applied to the problem at hand,” said Kumar. “Once we understood the motion challenge, the tracking approach became obvious.”

To test the new tracking technology, the team monitored adults engaging in common activities. The face-tracking algorithm improved the photoplethysmography signal in situations with low levels of motion – when the patient was reading or watching a video, for example. However, it remained relatively inaccurate when the patient was talking or smiling. These larger movements changed the facial light reflectance more dramatically and made extracting a reliable signal difficult.

The project is part of Rice University’s Scalable Health Initiative, an interdisciplinary project aiming to bring affordable medical technology to consumers. The team plans to further improve PPG’s performance under motion, with the hope that the technology could have applications outside the NICU. For instance, smartphone apps using the technology could eventually allow patients to track their health using the front-facing cameras on their devices.

Paper: “DistancePPG: Robust non-contact vital signs monitoring using a camera,” Mayank Kumar et al., Biomedical Optics Express Vol. 6, Issue 5, 1565-1588 (2015).
doi: http://dx.doi.org/10.1364/BOE.6.001565

Founded in 1916, The Optical Society (OSA) is the leading professional organization for scientists, engineers, students and entrepreneurs who fuel discoveries, shape real-life applications and accelerate achievements in the science of light. Through world-renowned publications, meetings and membership initiatives, OSA provides quality research, inspired interactions and dedicated resources for its extensive global network of optics and photonics experts. OSA is a founding partner of the National Photonics Initiative and the 2015 International Year of Light. For more information, visit www.osa.org.