A customer shared with us a solution that ran the SuperPoint (Github repo, Arxiv paper) feature-extraction neural network on-device (on an RVC2-based OAK-D) and then used the extracted features for localization and SLAM on the host computer.
RAE on-device VIO & SLAM
The demo below shows how you can run on-device VIO & SLAM on the RAE using the Spectacular AI SDK. Disparity matching and feature extraction + tracking run on accelerated blocks on the RVC3. The tracked features are then combined with disparity to produce tracked 3D points used by visual odometry (VO), and the Spectacular AI SDK fuses these with IMU data to provide accurate localization of the robot.
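To illustrate the step where tracked 2D features are combined with disparity to form 3D points, here is a minimal sketch of stereo back-projection. The intrinsics and baseline below are hypothetical placeholder values, not RAE calibration data, and the function is a simplified stand-in for what the accelerated blocks compute:

```python
# Sketch: lifting a tracked 2D feature to a 3D point using its disparity.
# All camera parameters here are illustrative, not real device calibration.

def disparity_to_3d(u, v, disparity_px, fx, fy, cx, cy, baseline_m):
    """Back-project a pixel with stereo disparity into camera-frame 3D (meters)."""
    if disparity_px <= 0:
        return None  # invalid / unmatched feature
    z = fx * baseline_m / disparity_px  # depth from stereo geometry
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Hypothetical intrinsics and one tracked feature:
point = disparity_to_3d(u=420, v=260, disparity_px=30.0,
                        fx=400.0, fy=400.0, cx=320.0, cy=240.0,
                        baseline_m=0.075)
print(point)  # (0.25, 0.05, 1.0): depth z = 400 * 0.075 / 30 = 1.0 m
```

VO then tracks such 3D points across frames, and the SDK fuses the resulting motion estimate with IMU data.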
Syncing frames and IMU messages
For VIO/SLAM solutions, you want to sync IMU messages with the middle of the frame's exposure. For exposure timings and timestamps, see Frame capture graphs for details, and see here for an IMU/frame syncing demo. Some more advanced algorithms weigh multiple IMU messages (before/after the exposure) and interpolate the final value.
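The interpolation idea can be sketched in a few lines: pick the IMU samples surrounding the mid-exposure timestamp and linearly interpolate between them. The timestamps and values below are made up for illustration, and the assumption that the frame timestamp marks the start of exposure is hypothetical (check your device's timestamping convention):

```python
# Sketch: interpolating IMU readings to a frame's mid-exposure timestamp.
# Timestamps are in seconds; the data is illustrative, not from a real device.

def mid_exposure_ts(capture_ts, exposure_s):
    # Assumption: capture_ts marks the start of exposure.
    return capture_ts + exposure_s / 2.0

def interpolate_imu(imu_samples, t):
    """imu_samples: sorted list of (timestamp, value).
    Returns the value at time t by linear interpolation
    between the two surrounding samples."""
    for (t0, v0), (t1, v1) in zip(imu_samples, imu_samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return v0 + w * (v1 - v0)
    raise ValueError("t is outside the IMU sample range")

# Example: three IMU samples around one frame
imu = [(0.000, 0.10), (0.005, 0.20), (0.010, 0.30)]
t = mid_exposure_ts(capture_ts=0.004, exposure_s=0.004)  # mid-exposure at t = 0.006 s
print(interpolate_imu(imu, t))
```

In practice you would apply this per-axis to accelerometer and gyroscope readings; the same weighting generalizes to more than two surrounding samples.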
Drone on-device NN-based localization
A novel AI approach for spatial localization using an OAK camera: the drone runs on-device neural inference to estimate its own position. Paper here, Github repo here.