

Poster

egoPPG: Heart Rate Estimation from Eye-Tracking Cameras in Egocentric Systems to Benefit Downstream Vision Tasks

Björn Braun · Rayan Armani · Manuel Meier · Max Moebus · Christian Holz


Abstract:

Egocentric vision systems aim to understand the wearer's spatial surroundings and their behavior within them, including motions, activities, and interactions with objects. Meta's Project Aria 2 recently added a contact heart rate (HR) sensor to additionally capture the wearer's cardiac activity, which can affect the person's attention and situational responses. In this paper, we propose egoPPG, a novel non-contact method that recovers cardiac activity from the eye-tracking cameras already built into existing egocentric vision systems. Our method continuously estimates the person's photoplethysmogram (PPG) from areas around the eyes and fuses motion cues from the headset's inertial measurement unit to track HR values. We demonstrate egoPPG's downstream benefit for existing egocentric datasets on EgoExo4D, where augmenting existing models with tracked HR values improves proficiency estimation by 14%. To train and validate egoPPG, we collected a dataset of more than 13 hours of eye-tracking videos from Project Aria, together with contact-based blood volume pulse signals and an electrocardiogram (ECG) for ground-truth HR values. 25 participants performed diverse everyday activities such as office work, cooking, dancing, and exercising, which induced significant natural motion and HR variation (44-164 bpm). Our model robustly estimates HR (MAE = 7.67 bpm) and captures HR patterns (r = 0.85). Our results show how egocentric systems may unify environmental and physiological tracking to better understand user actions and internal states. We will release our code, dataset, and HR augmentations for EgoExo4D for future research.
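To make the pipeline concrete, the sketch below shows a minimal, classical remote-PPG-style baseline for the core idea described in the abstract: average the eye-region pixel intensities over time, band-pass filter the resulting trace to the plausible heart-rate band, and read off the dominant spectral peak as HR. This is an illustrative simplification only; egoPPG itself uses a learned model and additionally fuses IMU motion cues, and the function name, frame rate, and band limits below are assumptions for illustration.

```python
# Illustrative sketch (not the authors' egoPPG model): classical intensity-based
# pulse recovery from eye-region crops, followed by spectral HR estimation.
import numpy as np
from scipy.signal import butter, filtfilt, periodogram


def estimate_hr_from_eye_region(frames, fps=30.0, low_hz=0.7, high_hz=3.0):
    """frames: np.ndarray of shape (T, H, W), grayscale eye-region crops over time."""
    # 1) Spatially average each frame to get a raw 1-D intensity trace (proxy PPG).
    raw = frames.reshape(frames.shape[0], -1).mean(axis=1)

    # 2) Band-pass filter to a plausible HR band (~42-180 bpm).
    b, a = butter(3, [low_hz / (fps / 2), high_hz / (fps / 2)], btype="band")
    ppg = filtfilt(b, a, raw - raw.mean())

    # 3) Take the dominant spectral peak within the band as the HR estimate.
    freqs, power = periodogram(ppg, fs=fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    hr_bpm = 60.0 * freqs[band][np.argmax(power[band])]
    return ppg, hr_bpm
```

In practice, such a hand-crafted baseline is sensitive to head motion and illumination changes, which is why the paper's approach fuses the headset's inertial measurement unit signals and trains a model end-to-end rather than relying on a fixed spectral peak.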
