# VIO and SLAM

VIO (visual-inertial odometry) and SLAM (simultaneous localization and mapping) estimate the camera trajectory while building and
updating a map from synchronized stereo, IMU, and other sensor data. OAK devices provide synchronized sensing and flexible
software paths, so you can start with native DepthAI and scale into external SLAM ecosystems.

## Integration paths

 1. Native in DepthAI v3: Use the [RTAB-Map VIO SLAM
    example](https://docs.luxonis.com/software-v3/depthai/examples/rvc2/vslam/rtab_map_vio_slam.md), built on the open-source
    [RTAB-Map](https://github.com/introlab/rtabmap) project and running natively in the DepthAI v3 library.
    
     > **Native VIO/SLAM runs on RVC2 today; RVC4 is available in early access**
     > Native VIO/SLAM support is currently available on RVC2 devices. RVC4 support is coming soon; contact us if you need
     > early access.

 2. ROS 2 stack: Integrate SLAM through the RTAB-Map package in ROS 2. We provide easy-to-run rtabmap_ros [launch files for
    OAK](https://docs.luxonis.com/software-v3/depthai/ros/driver.md), either as a standalone setup or as a [drop-in replacement
    for RealSense cameras](https://docs.luxonis.com/software-v3/depthai/ros/driver.md). Examples:
    
    * [RTABMap ROS2 driver](https://docs.luxonis.com/software-v3/depthai/ros/driver.md)
     * Follow the [rtabmap_ros + wheel odometry on
       OAK](https://discuss.luxonis.com/blog/6153-visual-slam-using-wheel-odometry-and-luxonis-oak-d-pro-on-ros-2) guide to fuse
       wheel odometry into a SLAM pipeline.

 3. Spectacular AI: Use the [Spectacular AI OAK wrapper](https://spectacularai.github.io/docs/sdk/wrappers/oak.html) for real-time
    VIO/SLAM pipelines.

 4. NVIDIA stack: Build with [PyCuVSLAM](https://github.com/nvidia-isaac/cuVSLAM) when targeting NVIDIA and Isaac workflows.
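
For the ROS 2 path, integration typically comes down to a single launch invocation. A hedged sketch, assuming the
`depthai_ros_driver` package from the linked driver docs is installed and ships an example RTAB-Map launch file (verify the
exact launch file and arguments for your install with `ros2 launch --show-args`):

```shell
# Sketch only: package and launch-file names assume a standard
# depthai_ros_driver install; check the linked driver docs for your setup.

# Start the OAK driver together with RTAB-Map via the example launch file
ros2 launch depthai_ros_driver rtabmap.launch.py
```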

## Typical use cases

 1. Robot localization: keep low-drift pose estimates where GNSS is unavailable or unreliable.
 2. Mapping and relocalization: map environments once, then relocalize reliably in repeat runs.
 3. Spatial autonomy: combine VIO pose with [depth](https://docs.luxonis.com/overview/toplevel-features/depth.md) and
    [AI](https://docs.luxonis.com/overview/toplevel-features/ai_capabilities.md) for navigation and obstacle-aware behavior.
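
As a concrete illustration of the third use case, the sketch below (plain NumPy; all names are hypothetical, not a DepthAI API)
shows how a 3D point measured in the depth camera's frame can be lifted into the world frame using the current VIO pose, which is
the core operation behind placing detected obstacles on a map:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_point_to_world(T_world_cam, p_cam):
    """Transform a 3D point from the camera frame into the world frame."""
    p = np.append(p_cam, 1.0)  # homogeneous coordinates
    return (T_world_cam @ p)[:3]

# Example: VIO reports the camera 2 m along world X, rotated 90 deg about Z.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([2.0, 0.0, 0.0])
T_world_cam = pose_to_matrix(R, t)

# Obstacle measured 1 m along the camera's X axis ends up at (2, 1, 0) in world.
p_cam = np.array([1.0, 0.0, 0.0])
p_world = camera_point_to_world(T_world_cam, p_cam)
```

In a real pipeline the rotation would come from the VIO pose message (often as a quaternion) rather than being constructed by
hand, but the frame composition is the same.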

## Guides and examples

### DepthAI: RTAB-Map VIO SLAM

Native DepthAI v3 SLAM pipeline using stereo + IMU

[DepthAI: RTAB-Map VIO SLAM](https://docs.luxonis.com/software-v3/depthai/examples/rvc2/vslam/rtab_map_vio_slam.md)

### DepthAI: Basalt VIO + RTAB-Map

Combine Basalt VIO with RTAB-Map SLAM in one pipeline

[DepthAI: Basalt VIO + RTAB-Map](https://docs.luxonis.com/software-v3/depthai/examples/rvc2/vslam/basalt_vio_rtab.md)

### ROS2 RTAB-Map Integration

Visual SLAM with wheel odometry and OAK-D Pro

[ROS2 RTAB-Map Integration](https://discuss.luxonis.com/blog/6153-visual-slam-using-wheel-odometry-and-luxonis-oak-d-pro-on-ros-2)

### Spectacular AI OAK Wrapper

Use OAK cameras directly in Spectacular AI SDK workflows

[Spectacular AI OAK Wrapper](https://spectacularai.github.io/docs/sdk/wrappers/oak.html)

### NVIDIA PyCuVSLAM

Integrate OAK pipelines with cuVSLAM / Isaac workflows

[NVIDIA PyCuVSLAM](https://github.com/nvidia-isaac/cuVSLAM)

### ROS VIO and SLAM Docs

General ROS guidance for VIO and SLAM with OAK cameras

[ROS VIO and SLAM Docs](https://docs.luxonis.com/software/ros/vio-slam.md)

### Need assistance?

Head over to the [Discussion Forum](https://discuss.luxonis.com/) for technical support or any other questions you might have.
