Aria Everyday Activities Dataset

Overview

The Aria Everyday Activities (AEA) dataset provides sequences collected using Project Aria glasses in a variety of egocentric scenarios, including cooking, exercising, playing games, and spending time with friends. The goal of AEA is to provide researchers with data for tackling the challenges of always-on egocentric vision. AEA contains multiple activity sequences in which 1-2 users wearing Project Aria glasses participate in scenarios to capture time-synchronized data in a shared world location.

For more information on the activities, see:

  • Activities: details about the scenarios and where specific activities occur within each recording sequence
  • Recording Scripts: more details about each scenario

The Aria Everyday Activities dataset was first released as part of the Aria Pilot Dataset (Project Aria’s first open dataset). It has since been improved with a new data format, new tooling that enables it to be used with Project Aria Tools, and additional machine perception data. The data was recorded using Recording Profile 9.
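
Because each recording is a standard VRS file, it can be opened with the Project Aria Tools Python bindings. The following is a minimal sketch, assuming the projectaria-tools package is installed and that recording.vrs is a placeholder for a downloaded AEA sequence; the exact API surface may vary between releases:

```python
from projectaria_tools.core import data_provider

# Open a downloaded AEA recording (placeholder path; substitute a real sequence).
provider = data_provider.create_vrs_data_provider("recording.vrs")

# Look up the RGB camera stream and fetch its first frame.
stream_id = provider.get_stream_id_from_label("camera-rgb")
image_data, record = provider.get_image_data_by_index(stream_id, 0)

# Convert the frame to a NumPy array for downstream processing.
frame = image_data.to_numpy_array()
print(frame.shape, record.capture_timestamp_ns)
```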

About the data

In addition to raw sensor data from Project Aria glasses, this dataset contains annotated speech-to-text data and results from our Machine Perception Services that provide additional context to the spatial-temporal reference frames (a loading sketch follows the list below). The dataset contains:

  • 143 recordings for Everyday Activities
  • Recordings in 5 locations, with 53 sequences in which 2 users recorded simultaneously
  • Over 1 million images
  • Over 7.5 hours of accumulated recording time
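
As an example of working with the speech-to-text annotations mentioned above, the sketch below loads them with pandas. The file name speech.csv and the column names are assumptions about the released format, not a documented contract; check a downloaded sequence for the actual layout:

```python
import pandas as pd

# Hypothetical file and column names for the per-sequence speech-to-text
# annotations; verify against the downloaded data.
speech = pd.read_csv("speech.csv")

# Assumed columns: start_timestamp_ns, end_timestamp_ns, written, confidence.
for row in speech.itertuples():
    print(f"[{row.start_timestamp_ns}-{row.end_timestamp_ns}] "
          f"{row.written} (conf={row.confidence:.2f})")
```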

Figure 1: Shared 3D global trajectories for multi-user activities in the same location, shown as a point cloud of a room with trajectories and RGB frames from specific parts of the map.
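
Because the multi-user sequences share a world frame, per-device trajectories can be compared directly. Below is a minimal sketch assuming the Machine Perception Services closed-loop trajectory output and the projectaria_tools MPS reader; the directory paths are placeholders:

```python
from projectaria_tools.core import mps

# Load the MPS closed-loop trajectories for two devices recorded in the
# same location (placeholder paths; substitute the downloaded MPS output).
traj_a = mps.read_closed_loop_trajectory("user_a/closed_loop_trajectory.csv")
traj_b = mps.read_closed_loop_trajectory("user_b/closed_loop_trajectory.csv")

# Each pose carries a device-to-world transform in the shared world frame,
# so positions from both users are directly comparable.
pos_a = traj_a[0].transform_world_device.translation()
pos_b = traj_b[0].transform_world_device.translation()
print("First-pose positions in the shared world frame:", pos_a, pos_b)
```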

Documentation

The AEA section of this wiki covers: