RTAB-Map VIO SLAM

This example builds a pipeline that combines a stereo camera pair running at 30 FPS, an IMU reporting raw accelerometer and gyroscope data at 200 Hz, and feature tracking with Harris corner detection. These feed into RTAB-Map VIO and SLAM nodes, which produce a 3D occupancy grid and point clouds; the results are visualized in real time via a RerunNode.

Setup

This example requires the DepthAI v3 API; see the installation instructions.
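If you are unsure whether the package is already installed, a quick stdlib-only check can tell you before you run the example (the helper name `depthai_available` is ours, not part of the API; `pip install depthai` is the usual install route):

```python
import importlib.util

def depthai_available() -> bool:
    # find_spec returns None when the package cannot be imported
    return importlib.util.find_spec("depthai") is not None

if not depthai_available():
    print("DepthAI not found; install it with: pip install depthai")
```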

Pipeline
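A rough sketch of the node graph wired up in the source code below (the original page may include a rendered diagram; this is a simplified text rendition):

```
Camera(CAM_B) ─┐
Camera(CAM_C) ─┴─> StereoDepth ──rectifiedLeft──> FeatureTracker ──features──> RTABMapVIO ──> RTABMapSLAM ──> RerunNode
IMU ──────────────────────────────────────────────────────────────────────────────^
(StereoDepth.depth also feeds RTABMapVIO)
```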

Source code

Python
import time

import depthai as dai
from rerun_node import RerunNode

# Create pipeline
with dai.Pipeline() as p:
    fps = 30
    width = 640
    height = 400

    # Define sources and outputs
    left = p.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B, sensorFps=fps)
    right = p.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C, sensorFps=fps)
    imu = p.create(dai.node.IMU)
    stereo = p.create(dai.node.StereoDepth)
    featureTracker = p.create(dai.node.FeatureTracker)
    odom = p.create(dai.node.RTABMapVIO)
    slam = p.create(dai.node.RTABMapSLAM)

    # RTAB-Map parameters: build a 3D occupancy grid and persist the working memory state
    params = {"RGBD/CreateOccupancyGrid": "true",
              "Grid/3D": "true",
              "Rtabmap/SaveWMState": "true"}
    slam.setParams(params)

    rerunViewer = RerunNode()

    # IMU: raw accelerometer + gyroscope at 200 Hz, reported as soon as available
    imu.enableIMUSensor([dai.IMUSensor.ACCELEROMETER_RAW, dai.IMUSensor.GYROSCOPE_RAW], 200)
    imu.setBatchReportThreshold(1)
    imu.setMaxBatchReports(10)

    # Feature tracker: Harris corners, up to 1000 target features
    featureTracker.setHardwareResources(1, 2)
    featureTracker.initialConfig.setCornerDetector(dai.FeatureTrackerConfig.CornerDetector.Type.HARRIS)
    featureTracker.initialConfig.setNumTargetFeatures(1000)
    featureTracker.initialConfig.setMotionEstimator(False)
    featureTracker.initialConfig.FeatureMaintainer.minimumDistanceBetweenFeatures = 49

    # Stereo depth: left-right check, distortion correction, depth aligned to the left camera
    stereo.setExtendedDisparity(False)
    stereo.setLeftRightCheck(True)
    stereo.setRectifyEdgeFillColor(0)
    stereo.enableDistortionCorrection(True)
    stereo.initialConfig.setLeftRightCheckThreshold(10)
    stereo.setDepthAlign(dai.CameraBoardSocket.CAM_B)

    # Linking
    left.requestOutput((width, height)).link(stereo.left)
    right.requestOutput((width, height)).link(stereo.right)
    featureTracker.passthroughInputImage.link(odom.rect)
    stereo.rectifiedLeft.link(featureTracker.inputImage)
    stereo.depth.link(odom.depth)
    imu.out.link(odom.imu)
    featureTracker.outputFeatures.link(odom.features)

    odom.transform.link(slam.odom)
    odom.passthroughRect.link(slam.rect)
    odom.passthroughDepth.link(slam.depth)

    slam.transform.link(rerunViewer.inputTrans)
    slam.passthroughRect.link(rerunViewer.inputImg)
    slam.occupancyGridMap.link(rerunViewer.inputGrid)
    slam.obstaclePCL.link(rerunViewer.inputObstaclePCL)
    slam.groundPCL.link(rerunViewer.inputGroundPCL)

    p.start()
    while p.isRunning():
        time.sleep(1)
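The RerunNode above receives the SLAM pose on `inputTrans`. As a standalone illustration of the kind of math a host-side visualizer performs on such a transform, here is a quaternion-to-Euler conversion in pure Python (no DepthAI dependency; the function name and argument order are ours, not part of the API):

```python
import math

def quaternion_to_euler(w: float, x: float, y: float, z: float) -> tuple:
    """Convert a unit quaternion to (roll, pitch, yaw) in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to guard against numerical drift pushing the argument out of [-1, 1]
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity rotation -> all angles zero
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```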
