VIO and SLAM

OAK cameras can also be used for visual-inertial odometry (VIO) and SLAM (Simultaneous Localization and Mapping).
  • RVC2: compute, especially CPU performance, is quite limited on the RVC2, so for VIO/SLAM you'd need to run the algorithm on the host computer (see the streaming sketch after this list).
  • RVC3: as it has a quad-core ARM CPU, you can run VIO/SLAM algorithms on the camera itself. See RAE on-device VIO & SLAM.
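For the RVC2 (host-side) case, a minimal sketch of streaming depth frames and IMU packets to the host with the DepthAI Python API could look as follows. The camera sockets, IMU rate, and queue sizes are illustrative, and the host-side VIO/SLAM backend is left as a placeholder:

```python
import depthai as dai

pipeline = dai.Pipeline()

# Left/right mono cameras feeding a stereo depth node
monoLeft = pipeline.create(dai.node.MonoCamera)
monoRight = pipeline.create(dai.node.MonoCamera)
monoLeft.setBoardSocket(dai.CameraBoardSocket.CAM_B)
monoRight.setBoardSocket(dai.CameraBoardSocket.CAM_C)

stereo = pipeline.create(dai.node.StereoDepth)
monoLeft.out.link(stereo.left)
monoRight.out.link(stereo.right)

# IMU node streaming raw accelerometer + gyroscope reports
imu = pipeline.create(dai.node.IMU)
imu.enableIMUSensor([dai.IMUSensor.ACCELEROMETER_RAW, dai.IMUSensor.GYROSCOPE_RAW], 200)
imu.setBatchReportThreshold(1)
imu.setMaxBatchReports(10)

# Stream depth frames and IMU packets to the host over XLink
xoutDepth = pipeline.create(dai.node.XLinkOut)
xoutDepth.setStreamName("depth")
stereo.depth.link(xoutDepth.input)

xoutImu = pipeline.create(dai.node.XLinkOut)
xoutImu.setStreamName("imu")
imu.out.link(xoutImu.input)

with dai.Device(pipeline) as device:
    depthQ = device.getOutputQueue("depth", maxSize=4, blocking=False)
    imuQ = device.getOutputQueue("imu", maxSize=50, blocking=False)
    while True:
        depthFrame = depthQ.get()   # dai.ImgFrame
        imuPackets = imuQ.tryGetAll()  # list of dai.IMUData
        # ... feed frames + IMU packets into your host-side VIO/SLAM backend
```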
Several SLAM and localization projects support OAK cameras:

On-device SuperPoint for localization and SLAM

A customer shared with us a solution that runs the SuperPoint (Github repo, Arxiv paper) feature-extraction NN on-device (on an RVC2-based OAK-D) and then uses the extracted features for localization and SLAM on the host computer.
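We don't have the customer's code, but a minimal sketch of running a SuperPoint blob on-device with DepthAI's NeuralNetwork node could look like the following. The blob path, input resolution, and output layer names ("semi", "desc") are assumptions that depend on how the model was exported:

```python
import depthai as dai

pipeline = dai.Pipeline()

# Grayscale camera feeding the feature-extraction network
mono = pipeline.create(dai.node.MonoCamera)
mono.setBoardSocket(dai.CameraBoardSocket.CAM_B)

# Resize/convert to the network's expected input (example resolution)
manip = pipeline.create(dai.node.ImageManip)
manip.initialConfig.setResize(320, 200)
manip.initialConfig.setFrameType(dai.ImgFrame.Type.GRAY8)
mono.out.link(manip.inputImage)

nn = pipeline.create(dai.node.NeuralNetwork)
nn.setBlobPath("superpoint_320x200.blob")  # hypothetical path to a compiled SuperPoint blob
manip.out.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("superpoint")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("superpoint", maxSize=4, blocking=False)
    while True:
        nnData = q.get()  # dai.NNData with raw output tensors
        # Layer names depend on the exported model; SuperPoint's defaults are
        # "semi" (keypoint heatmap) and "desc" (descriptors)
        heatmap = nnData.getLayerFp16("semi")
        descriptors = nnData.getLayerFp16("desc")
        # ... run non-max suppression, sample descriptors, feed the SLAM front-end
```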

RAE on-device VIO & SLAM

The demo below shows how you can run on-device VIO & SLAM on the RAE using the Spectacular AI SDK. Disparity matching and feature extraction + tracking are done on accelerated blocks on the RVC3. The features are then combined with disparity to produce tracked 3D points used by visual odometry, and the Spectacular AI SDK fuses these with IMU data to provide accurate localization of the robot.
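The RAE demo runs the SDK on the RVC3 itself, so there is little to write on the host side. For orientation, the host-side flavor of the Spectacular AI SDK (as used with regular OAK cameras) is only a few lines; this sketch follows the SDK's public Python examples and prints fused pose outputs as JSON:

```python
import depthai
import spectacularAI

# Build a standard DepthAI pipeline and let the Spectacular AI plugin
# add the stereo, feature-tracking and IMU nodes it needs
pipeline = depthai.Pipeline()
vio_pipeline = spectacularAI.depthai.Pipeline(pipeline)

with depthai.Device(pipeline) as device, \
     vio_pipeline.startSession(device) as vio_session:
    while True:
        # Blocks until the next fused VIO output (pose, velocity, ...)
        out = vio_session.waitForOutput()
        print(out.asJson())
```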

Syncing frames and IMU messages

For VIO/SLAM solutions, you want to sync IMU messages with the middle of the frame's exposure. For exposure timings and timestamps, see Frame capture graphs for details, and see here for an IMU/frame syncing demo. Some more advanced algorithms weight multiple IMU messages (before/after the exposure) and interpolate the final value, as in the sketch below.
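As an illustration of the "interpolate around mid-exposure" idea, a small host-side helper could look like this. It is a plain-Python sketch; with DepthAI, the frame timestamp and exposure time can be read from the ImgFrame, and the assumption that the timestamp marks the end of exposure is ours, so adjust it to how your driver stamps frames:

```python
import datetime
from dataclasses import dataclass

@dataclass
class ImuSample:
    timestamp: datetime.timedelta  # device/host-synced timestamp
    value: tuple                   # e.g. (gx, gy, gz)

def imu_at_mid_exposure(frame_ts, exposure_time, imu_samples):
    """Linearly interpolate IMU samples to the middle of the frame's exposure.

    Assumes frame_ts marks the end of exposure; with DepthAI these values can
    be obtained from ImgFrame.getTimestamp() and ImgFrame.getExposureTime().
    """
    mid_exposure = frame_ts - exposure_time / 2

    # Find the two IMU samples bracketing the mid-exposure timestamp
    before = max((s for s in imu_samples if s.timestamp <= mid_exposure),
                 key=lambda s: s.timestamp, default=None)
    after = min((s for s in imu_samples if s.timestamp >= mid_exposure),
                key=lambda s: s.timestamp, default=None)

    if before is None and after is None:
        return None
    if before is None or after is None:
        return (before or after).value  # only one side available, no interpolation
    if before.timestamp == after.timestamp:
        return before.value

    # Linear interpolation weight between the bracketing samples
    t = (mid_exposure - before.timestamp) / (after.timestamp - before.timestamp)
    return tuple(b + t * (a - b) for b, a in zip(before.value, after.value))
```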

Drone on-device NN-based localization

A novel AI approach to spatial localization using an OAK camera: the drone runs neural inference on-device to estimate its own position. Paper here, Github repo here.