Project Aria Docs FAQ

What is Project Aria?

Project Aria is a research program, with supporting spatial AI machine perception technologies and open science initiatives, created to serve as a foundational building block for higher-level contextualized AI and the novel compute and interaction paradigms needed to make AR successful.

Project Aria glasses are at the program's core: a multimodal data collection device that captures data from the most human of perspectives. Project Aria glasses have five cameras (two Mono Scene cameras, one RGB camera, and two Eye Tracking cameras) as well as non-visual sensors (two IMUs, a magnetometer, barometer, GPS receiver, Wi-Fi beacon, Bluetooth beacon, and microphones). On-device compute is used to store information in VRS files, which preserve the integrity of individual sensor streams that are time aligned using shared timestamps.

The whitepaper, “Project Aria: A New Tool for Egocentric Multi-Modal AI Research”, provides more details about Project Aria glasses, the data they generate, and the motivation for the program.

What Project Aria data is available?

Researchers have used Project Aria to develop a wide range of powerful open datasets.

You can preview Aria open data prior to downloading it via the Aria Dataset Explorer.

What can Project Aria be used for?

Project Aria devices can be used to:

  • Collect data
    • In addition to capturing data from a single device, TICSync can be used to capture time synchronized data from multiple Aria devices in a shared world location
    • Cloud based Machine Perception Services (MPS) are available to generate SLAM, Multi-SLAM, Eye Gaze and Hand Tracking derived data outputs. Partner data is only used to serve MPS requests; it is not available to Meta researchers or Meta’s affiliates.
  • Stream data
    • Use the Project Aria Client SDK to stream and subscribe to data on your local machine

A range of models have been created using Aria data, including:

  • EgoBlur: an open source AI model from Meta to preserve privacy by detecting and blurring PII from images. Designed to work with egocentric data (such as Aria data) and non-egocentric data.
  • Project Aria Eye Tracking: open source inference code for the pre-March 2024 Eye Gaze Model used by Machine Perception Services (MPS)
  • SceneScript: an AI model and method to understand and describe 3D spaces

What is the software ecosystem?

While some tools run on other platforms, when developing desktop tools we primarily support:

  • x64 Linux distributions of:
    • Fedora 36, 37, 38
    • Ubuntu jammy (22.04 LTS) and focal (20.04 LTS)
  • Mac Intel or Mac ARM-based (M1) with macOS 11 (Big Sur) or newer

To use the Python version of our tools, you'll generally need Python 3.9+ (3.10+ if you are on Apple Silicon). Venv virtual environments are used and tested for most applications; in most instances Conda should also work, with the exception of the Project Aria Client SDK.

Project Aria Tools

Project Aria Tools provides open source Python and C++ code and APIs for working with Aria data. Due to the multimodal nature of Project Aria data, the data is stored in VRS files. Project Aria Tools provides API wrappers for using VRS tools, specifically tailored to working with Aria data. Using Project Aria Tools should make it easier to plug your data into other applications with more accessible formatting and labeling.
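
As a minimal sketch of what reading a recording looks like in Python (assuming a local file named recording.vrs; exact method names may vary between releases):

    from projectaria_tools.core import data_provider

    # Open a VRS recording through the Project Aria Tools wrapper.
    provider = data_provider.create_vrs_data_provider("recording.vrs")

    # Look up the RGB camera stream by its label and fetch the first frame.
    stream_id = provider.get_stream_id_from_label("camera-rgb")
    image_data, record = provider.get_image_data_by_index(stream_id, 0)
    print(image_data.to_numpy_array().shape, record.capture_timestamp_ns)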

If there are VRS functions you wish Project Aria Tools had, please contact us using any of our Support Channels.

APIs

  • DataProvider
    • Read Project Aria sequences and raw sensor data (VRS files)
  • Calibration
    • Retrieve calibration data and interact with Aria camera models (see the sketch after this list)
  • MPS
    • Read Project Aria MPS (derived) data
  • Images
    • Python Visualization Guide
    • C++ Visualization Guide
    • Tooling enables users to:
      • Visualize time-synced recordings for one or more Project Aria sequences
      • Visualize recordings and see 3D visualizations of pre-computed trajectories using time-synced data
      • Display an interactive visualization of the Aria VRS RGB frames along with MPS data (closed loop trajectory, global point cloud, wearer eye gaze)
      • Display the relative position and orientation of Project Aria glasses sensors (cameras, IMUs, microphones, magnetometer & barometer) in a common reference frame (Python only)
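
A minimal sketch of the Calibration API (assuming a local recording.vrs; names follow the Python bindings and may vary between releases):

    import numpy as np
    from projectaria_tools.core import data_provider

    provider = data_provider.create_vrs_data_provider("recording.vrs")

    # Factory calibration for the whole device, then per-sensor calibration.
    device_calib = provider.get_device_calibration()
    rgb_calib = device_calib.get_camera_calib("camera-rgb")

    # Project a 3D point in the camera frame into pixel coordinates.
    pixel = rgb_calib.project(np.array([0.0, 0.0, 1.0]))
    print(pixel)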

Check the Project Aria Tools release log for the latest updates.

Data Review

The vrs_to_mp4 script exports a VRS file to MP4 and was created to help researchers send files for review more easily. Please note, MP4 files will not contain the full sensor suite or the timestamps necessary to align with other recordings.


Aria Research Kit

The Aria Research Kit (ARK) provides Aria glasses, services and tools to enable researchers to gather and process their own Aria data. Go to the Aria Research Kit intro page at projectaria.com to find out more, or to apply to become a research partner.

The following software and capabilities are only relevant if you have access to Project Aria glasses and can collect your own data.

Mobile Companion app (Required)

The Mobile Companion app provides the ability to interact with and record from your Aria glasses via Bluetooth. When you receive your device, you must set it up using the Mobile Companion app so that your glasses can receive OTA software updates.

The Mobile Companion app is available for both Android and iOS devices.

Project Aria Client SDK with CLI

Through the Client SDK with CLI, you can directly interact with Aria glasses and stream data from the glasses to a computer. The SDK is not supported on Ubuntu focal.

After connecting the Project Aria glasses to a PC via USB or Wi-Fi, the SDK enables users to:

  • Retrieve device information, status and sensor calibration data
  • Configure, start and stop recording for a single device (see the sketch below)
  • Configure, start and stop recording for multiple time synchronized devices (using TICSync)
  • Configure, start and stop streaming from the glasses
  • Subscribe to streaming data from the glasses

You will need to install the Client SDK with CLI using a venv virtual environment. The authorization certificates required to pair your glasses with the computer will not be transferred correctly when using Conda.
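
As referenced in the list above, a minimal sketch of recording with a single device (module, class and profile names follow the SDK's Python samples and may change between releases):

    import aria.sdk as aria

    # Connect to the glasses (over USB by default).
    device_client = aria.DeviceClient()
    device = device_client.connect()

    # Configure a recording with a sensor profile and start it.
    recording_manager = device.recording_manager
    recording_config = aria.RecordingConfig()
    recording_config.profile_name = "profile9"  # example profile name
    recording_manager.recording_config = recording_config
    recording_manager.start_recording()

    # ... capture data, then stop and disconnect.
    recording_manager.stop_recording()
    device_client.disconnect(device)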

MPS CLI

Accessed via the PyPI installation of Project Aria Tools, the Project Aria MPS Command Line Interface (MPS CLI) provides a streamlined way to request Machine Perception Services (MPS).
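
For example, a single-sequence request can be submitted with a command along the lines of aria_mps single -i /path/to/recording.vrs (check aria_mps --help for the current options).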

Aria Studio

Aria Studio is an application that launches in your default web browser and provides the ability to:

  • Examine preview thumbnails and metadata of Aria recordings (on device or stored locally)
  • Transfer recordings from the device to the local computer or delete recordings stored on the device or local computer
  • Visualize recordings
    • We show visualizations of all the sensor streams except for audio. Audio outputs are not supported at this time.
  • Group recordings for multi-sequence processing jobs
    • Grouped recordings are a special feature unlocked by Aria Studio. Users can group VRS recordings from anywhere in their directory for organizational purposes, or to request Multi-SLAM outputs.
    • Groups are stored on your local machine and associated with your Aria user account
  • Request and manage Machine Perception Services (MPS) requests

As with the MPS CLI, you can gain granular control over settings such as upload and download speeds by customizing $HOME/.projectaria/mps.ini.

Desktop Companion app

The Desktop Companion app provides the ability to record, transfer, process, validate and visualize Aria data through a desktop interface. Users can also request MPS data for single recordings. Please note, Aria Studio is the preferred GUI for working with Aria data.

While Aria data can be downloaded using the Desktop app, it may be faster to download the data using other means.

The Desktop Companion app can be installed on:

  • Mac Intel or Mac ARM-based (M1) with macOS 11.3 (Big Sur) or newer
  • Windows 10 x86_64 with DirectX (or OpenGL, provided the latest drivers are installed)
  • Windows 11 may work, but is not actively supported at this time
  • Linux: a Debian package for Ubuntu 22.04 LTS. The app was only tested on that specific version under GNOME 42.5 with X11 (X Server) as well as Wayland. Other Debian-based distributions (Ubuntu 22.04 forks such as Kubuntu, Mint, etc.) or environments may or may not work.

EgoBlur

EgoBlur is an open source AI model from Meta to preserve privacy by detecting and blurring PII from images. We provide two FasterRCNN-based detector models, one for faces and one for license plates, which perform consistently across the full range of ‘responsible AI labels’ as defined by the CCV2 dataset.
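
As a rough sketch of how such a detector could be applied (the model file name, output format and score threshold below are assumptions; consult the EgoBlur repository for the actual demo script):

    import torch
    import torchvision
    from torchvision.transforms.functional import gaussian_blur

    # Hypothetical file name for one of the released TorchScript detectors.
    detector = torch.jit.load("ego_blur_face.jit").eval()

    image = torchvision.io.read_image("frame.png")  # CHW uint8 tensor
    with torch.no_grad():
        # Assumed FasterRCNN-style output: boxes, labels, scores.
        boxes, labels, scores = detector(image.float())

    # Blur every sufficiently confident detection in place.
    for (x1, y1, x2, y2), score in zip(boxes.int().tolist(), scores.tolist()):
        if score >= 0.5:
            image[:, y1:y2, x1:x2] = gaussian_blur(
                image[:, y1:y2, x1:x2], kernel_size=[51, 51]
            )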

Project Aria Eye Tracking

Project Aria Eye Tracking provides open source inference code for the pre-March 2024 Eye Gaze Model used by MPS.

  • Provides the ability to generate eye gaze data using the old model, if that is more suited to your research use case or if you’re not able to request MPS
  • This code can be used on downloaded data or when streaming data using the Aria Client SDK

How do I use Aria data?

Because of Aria’s unique data structure, we recommend using Project Aria Tools as an interface to access the data via a data provider abstraction that allows users to iterate through all the sensor recordings.
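
A minimal sketch of that iteration pattern (assuming a local recording.vrs):

    from projectaria_tools.core import data_provider
    from projectaria_tools.core.sensor_data import TimeDomain

    provider = data_provider.create_vrs_data_provider("recording.vrs")

    # Iterate through all sensor streams in device-time order.
    for data in provider.deliver_queued_sensor_data():
        label = provider.get_label_from_stream_id(data.stream_id())
        print(label, data.sensor_data_type(), data.get_time_ns(TimeDomain.DEVICE_TIME))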

How do I use the IMU data?

We recommend using Trajectory MPS outputs instead of raw IMU data wherever possible. Go to MPS Code Snippets for how to load open loop or closed loop trajectory data.
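
For example, a closed loop trajectory can be loaded along these lines (assuming the MPS outputs have been downloaded next to the recording):

    from projectaria_tools.core import mps

    # Load the closed loop trajectory produced by SLAM MPS.
    trajectory = mps.read_closed_loop_trajectory("closed_loop_trajectory.csv")

    # Each pose carries a device-to-world transform and a timestamp.
    pose = trajectory[0]
    print(pose.tracking_timestamp, pose.transform_world_device)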

What calibration information is available?

General calibration concepts

SLAM MPS Calibration data

Online_calibration.jsonl is a SLAM MPS output that contains one JSON online calibration record per line. Each record is a JSON dict object that contains timestamp metadata and the result of online calibration for the cameras and IMUs.

The calibration parameters contain intrinsics and extrinsics parameters for each sensor, as well as time offsets that best temporally align their data.
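
Project Aria Tools can parse this file directly; a minimal sketch (assuming the file has been downloaded locally):

    from projectaria_tools.core import mps

    # Each entry holds the online camera and IMU calibrations for one timestamp.
    online_calibs = mps.read_online_calibration("online_calibration.jsonl")
    first = online_calibs[0]
    print(first.tracking_timestamp, len(first.camera_calibs), len(first.imu_calibs))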

In-Session Eye Gaze Calibration

This is not related to standard calibration. In-Session Eye Gaze Calibration enables researchers to improve the eye gaze estimations in Eye Gaze MPS outputs, enabling researchers to more accurately determine where wearers are looking during the recordings.

When users collect data using Aria glasses, they can initiate In-Session Eye Gaze Calibration. Once they’ve performed this task, personalized as well as general Eye Gaze MPS can be generated. Every person is unique in terms of how they move their eyes and look at objects, so the personalized_eye_gaze estimation is expected to be more accurate for the individual.
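
Both outputs can be read the same way; a minimal sketch (assuming the personalized output has been downloaded as personalized_eye_gaze.csv; the fixed 1 m depth is an illustrative assumption):

    from projectaria_tools.core import mps
    from projectaria_tools.core.mps.utils import get_eyegaze_point_at_depth

    # Personalized and general eye gaze share the same record format.
    gaze_records = mps.read_eyegaze("personalized_eye_gaze.csv")
    gaze = gaze_records[0]

    # Convert gaze angles into a 3D point at an assumed depth of 1 m.
    point_cpf = get_eyegaze_point_at_depth(gaze.yaw, gaze.pitch, 1.0)
    print(gaze.tracking_timestamp, point_cpf)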

Should I use VRS or Project Aria Tools?

Project Aria Tools provides VrsDataProvider which offers some, but not all, of the VRS API capabilities. Go to How Project Aria Uses VRS for more information.

We’ve tailored VrsDataProvider to meet the requirements of people using Aria data. If there are capabilities in the VRS API that you would like to access when using Project Aria Tools, please use one of our Support channels to let us know what you’re looking for and what features you would like.

How do I get involved?

The Project Aria Contact Us page contains a variety of ways to get in touch. We’d love to hear from you.

Download and explore our various open datasets and let us know how it goes!

For questions or feature requests relating to Project Aria Tools, we encourage you to post to the Issues page on GitHub.

Go to our Attribution and Contributing page if you’d like to find out how to cite Project Aria or contribute to our open tooling.

Research Challenges are posted to projectaria.com. Open research challenges allow us to collectively accelerate the state-of-the-art with the community. We believe these challenges will help to establish a baseline for external researchers to build and foster reproducible research on egocentric Computer Vision and AI/ML algorithms for scene perception, reconstruction and understanding, and contextual AI.

If you’d like to collect your own data, or use Project Aria glasses in streaming applications, go to the Aria Research Kit intro page at projectaria.com to find out how to become a research partner.