SLAM with OAK

On-board localization (VIO) and SLAM (Simultaneous Localization And Mapping) on current OAK cameras (RVC2) aren’t yet supported.

Our upcoming Series 3 OAK cameras with RVC3 have a quad-core Arm Cortex-A53 @ 1.5 GHz integrated into the VPU. There will be an open-source SLAM implementation on the RVC3. Users will be able to run custom containerized apps on the ARM, which will allow other companies (which specialize in VIO/SLAM) to port their software stacks to our cameras and license them.

Below are several SLAM and localization projects that support OAK-D cameras:

On-device SuperPoint for localization and SLAM

A customer shared with us a solution that runs the SuperPoint (GitHub repo, arXiv paper) feature extraction NN on-device (on an RVC2-based OAK-D) and then uses the extracted features for localization and SLAM on the host computer.
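
The customer's exact pipeline isn't published, but the general approach can be sketched with the DepthAI Python API: run a feature-extraction network on the device and stream its raw outputs to the host. The blob path, input resolution and output layer names below ("semi"/"desc") are assumptions for illustration; they depend on how the SuperPoint model was exported.

```python
import numpy as np
import depthai as dai

pipeline = dai.Pipeline()

# Grayscale camera, resized to the network's (assumed) input resolution
mono = pipeline.create(dai.node.MonoCamera)
mono.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)

manip = pipeline.create(dai.node.ImageManip)
manip.initialConfig.setResize(320, 200)  # assumed SuperPoint input size
mono.out.link(manip.inputImage)

# Feature-extraction network runs on the RVC2; the blob path is a placeholder
nn = pipeline.create(dai.node.NeuralNetwork)
nn.setBlobPath("superpoint_320x200.blob")  # hypothetical pre-compiled .blob
manip.out.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("features")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("features", maxSize=4, blocking=False)
    while True:
        msg = q.get()
        # Output layer names depend on the export; "semi" (keypoint heatmap)
        # and "desc" (descriptors) are common for SuperPoint models
        heatmap = np.array(msg.getLayerFp16("semi"))
        descriptors = np.array(msg.getLayerFp16("desc"))
        # ... feed keypoints/descriptors into the host-side localization/SLAM backend
```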

RAE on-device VIO & SLAM

The demo below shows how you can run on-device VIO & SLAM on the RAE robot using the Spectacular AI SDK. Disparity matching and feature extraction + tracking run on accelerated blocks of the RVC3 chip. The tracked features are combined with disparity to produce tracked 3D points used by visual odometry, and the Spectacular AI SDK fuses these with IMU data to provide accurate localization of the robot.
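
A minimal host-side sketch, assuming the Spectacular AI SDK's depthai Python plugin (the API published for RVC2-based OAK-D devices); the exact RAE/RVC3 integration may differ, but in both cases the SDK hides the pipeline wiring:

```python
import depthai as dai
import spectacularAI  # Spectacular AI SDK Python package (separate install/license)

pipeline = dai.Pipeline()
# The plugin adds the stereo, feature-tracking and IMU nodes it needs to the pipeline
vio_pipeline = spectacularAI.depthai.Pipeline(pipeline)

with dai.Device(pipeline) as device, \
     vio_pipeline.startSession(device) as vio_session:
    while True:
        out = vio_session.waitForOutput()  # blocks until the next pose estimate
        print(out.asJson())                # 6-DoF pose + velocity as JSON
```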

Syncing frames and IMU messages

For VIO/SLAM solutions, you typically want to sync IMU messages with the middle of the frame's exposure. For exposure timings and timestamps, see Frame capture graphs for details. See here for an IMU/frame syncing demo.
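
A minimal sketch of host-side pairing: pick the IMU packet whose device timestamp is closest to the frame's approximate mid-exposure time. Stream names, sensor rates and the mid-exposure approximation (which assumes ImgFrame.getExposureTime() is available in your DepthAI version) are illustrative, not a definitive recipe:

```python
import depthai as dai

pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)

imu = pipeline.create(dai.node.IMU)
imu.enableIMUSensor(dai.IMUSensor.ACCELEROMETER_RAW, 400)  # 400 Hz is illustrative
imu.setBatchReportThreshold(1)
imu.setMaxBatchReports(10)

xout_rgb = pipeline.create(dai.node.XLinkOut)
xout_rgb.setStreamName("rgb")
cam.isp.link(xout_rgb.input)

xout_imu = pipeline.create(dai.node.XLinkOut)
xout_imu.setStreamName("imu")
imu.out.link(xout_imu.input)

with dai.Device(pipeline) as device:
    q_rgb = device.getOutputQueue("rgb", maxSize=4, blocking=False)
    q_imu = device.getOutputQueue("imu", maxSize=50, blocking=False)
    imu_packets = []

    while True:
        imu_msg = q_imu.tryGet()
        if imu_msg is not None:
            imu_packets += imu_msg.packets
            imu_packets = imu_packets[-500:]  # keep a bounded history

        frame = q_rgb.tryGet()
        if frame is None or not imu_packets:
            continue

        # Approximate mid-exposure time; see the Frame capture graphs page for what
        # the frame timestamp actually refers to on your sensor/firmware.
        mid_exposure = frame.getTimestampDevice() - frame.getExposureTime() / 2

        closest = min(
            imu_packets,
            key=lambda p: abs(p.acceleroMeter.getTimestampDevice() - mid_exposure),
        )
        # 'closest' is the IMU packet to associate with this frame
```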

Some more advanced algorithms weight multiple IMU messages (before/after the exposure) and interpolate the final value.
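
A purely illustrative sketch of that interpolation step, linearly blending the two IMU samples that bracket the mid-exposure time (the function and sample values are hypothetical, not tied to any particular SDK):

```python
def interpolate_imu(before, after, t_mid):
    """Linearly interpolate two (timestamp, value) IMU samples at time t_mid.

    'before' and 'after' bracket t_mid; 'value' could be e.g. one
    accelerometer or gyroscope axis reading.
    """
    (t0, v0), (t1, v1) = before, after
    if t1 == t0:
        return v0
    w = (t_mid - t0) / (t1 - t0)  # 0 at 'before', 1 at 'after'
    return v0 + w * (v1 - v0)

# Example: gyro x-axis readings 2 ms before and 3 ms after mid-exposure
print(interpolate_imu((0.000, 0.10), (0.005, 0.20), t_mid=0.002))  # -> ~0.14
```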

Got questions?

Head over to the Discussion Forum for technical support or any other questions you might have.