

Tutorial

Fourth Hands-on Egocentric Research Tutorial with Project Aria, from Meta

James Fort

[ Project Page ]
Mon 20 Oct 11 a.m. PDT — 2 p.m. PDT

Abstract:

Project Aria is a research device, worn like a regular pair of glasses, that lets researchers study the future of computer vision with always-on sensing. Sensors in Project Aria capture egocentric video and audio, in addition to eye-gaze, inertial, and location information. On-device compute is used to encrypt and store information that, when uploaded to separate designated back-end storage, helps researchers build the capabilities necessary for AR to work in the real world. In this fourth tutorial, in addition to sharing research from academic partner program members, we will introduce the second generation of Aria glasses, 'Aria Gen 2', announced in February. As part of this introduction, we will provide a live hands-on demo of the Aria Research Kit (including Gen 2 glasses), describe how researchers can gain access to the Project Aria academic program, and demonstrate how open-source tools can be used to accelerate research on specific challenges, including visual and non-visual localization and mapping, static and dynamic object detection and spatialization, human pose estimation, and building geometry estimation. We will review new open datasets from Meta and academic partners, including a dataset of 6,000+ 3D objects with Aria captures for each object to facilitate novel research on egocentric 3D object reconstruction, and we will survey the ego-perception challenges and benchmarks associated with all datasets, demonstrating methods for approaching each challenge.
