Dynamic Calibration

Dynamic Calibration (DCL) is a self-calibration workflow built into DepthAI 3.0 that restores and maintains stereo accuracy when temperature changes, physical shocks, or long-term drift degrade factory calibration.

Key capabilities

  • Restore depth performance—brings the disparity map back to optimal visual quality.
  • Broad compatibility—works on all stereo devices supported by DepthAI 3.0.
  • No targets required—operate in natural scenes; just move the camera to capture a varied view.
  • Rapid execution—typically completes in seconds.
  • Health monitoring—run diagnostics at any time without flashing new calibration.

Usage of the Dynamic Calibration Library (DCL)

This section demonstrates how to use the DynamicCalibration node in DepthAI for dynamic calibration workflows.

Initializing the DynamicCalibration Node

The DynamicCalibration node requires two synchronized camera streams from the same device. Here's how to set it up:
Python
import depthai as dai

# Initialize the pipeline
pipeline = dai.Pipeline()

# Create camera nodes
cam_left  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
cam_right = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

# Request full-resolution NV12 outputs from both cameras
left_out  = cam_left.requestFullResolutionOutput(dai.ImgFrame.Type.NV12)
right_out = cam_right.requestFullResolutionOutput(dai.ImgFrame.Type.NV12)

# Initialize the DynamicCalibration node
dyn_calib = pipeline.create(dai.node.DynamicCalibration)

# Link the cameras to the DynamicCalibration node
left_out.link(dyn_calib.left)
right_out.link(dyn_calib.right)

# Reload the current calibration onto the device
device = pipeline.getDefaultDevice()
calibration = device.readCalibration()
device.setCalibration(calibration)

pipeline.start()
while pipeline.isRunning():
    ...

Sending Commands to the Node

Nodes in DepthAI communicate via input/output message queues. The DynamicCalibration node has several queues, but the most important for control is the inputControl queue.
Python
# Initialize the command input queue
command_input = dyn_calib.inputControl.createInputQueue()

# Example: send a command to start calibration
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.StartCalibration()
    )
)
Other Available Commands
  • StartCalibration() - Starts the calibration process.
  • StopCalibration() - Stops the calibration process.
  • Calibrate(force=False) - Computes a new calibration from the already loaded data.
    • force - ignores the restrictions on the amount of loaded data
  • CalibrationQuality(force=False) - Evaluates the quality of the current calibration.
    • force - ignores the restrictions on the amount of loaded data
  • LoadImage() - Loads one image from the device.
  • ApplyCalibration(calibration) - Applies the given calibration to the device.
  • SetPerformanceMode(performanceMode) - Sets the performance mode to be used.
  • ResetData() - Removes all previously loaded data.
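Every command is dispatched the same way: wrap it in a dai.DynamicCalibrationControl message and send it over the inputControl queue. A minimal sketch, reusing the command_input queue created above:
Python
# Stop a running calibration
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.StopCalibration()
    )
)

# Discard all previously loaded data
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.ResetData()
    )
)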

Receiving Data from the Node

The node provides multiple output queues:
  • calibrationOutput → calibration results (DynamicCalibrationResult)
  • coverageOutput → coverage statistics (CoverageData)
  • qualityOutput → calibration quality check (CalibrationQuality)
Python
# Queue for receiving new calibrations
calibration_output = dyn_calib.calibrationOutput.createOutputQueue()
# Queue for receiving the coverage
coverage_output = dyn_calib.coverageOutput.createOutputQueue()
# Queue for checking the calibration quality
quality_output = dyn_calib.qualityOutput.createOutputQueue()

Data Structures

Python
# Output data structure from coverageOutput
class CoverageData:
    coveragePerCellA: np.ndarray  # Per-cell coverage of camera A [0-1]
    coveragePerCellB: np.ndarray  # Per-cell coverage of camera B [0-1]
    meanCoverage: float           # Average coverage, 0-100 %
    dataAcquired: float           # Acquired data, 0-100 %
    coverageAcquired: float       # Acquired coverage, 0-100 %

# Output of calibration quality checks
class CalibrationQuality.Data:
    rotationChange: list        # Change in rotation [rx, ry, rz] in degrees
    depthErrorDifference: list  # Change in depth error at [1 m, 2 m, 5 m, 10 m] in %
    sampsonErrorCurrent: float  # Mean Sampson error before calibration
    sampsonErrorNew: float      # Mean Sampson error after calibration

class CalibrationQuality:
    qualityData: CalibrationQuality.Data  # Statistics about the calibration change
    info: str                             # Error or status message

# Output data structure from calibrationOutput
class DynamicCalibrationResult.Data:
    newCalibration: dai.CalibrationHandler
    currentCalibration: dai.CalibrationHandler
    calibrationDifference: CalibrationQuality.Data  # Difference between the new and current calibrations

class DynamicCalibrationResult:
    calibrationData: DynamicCalibrationResult.Data  # None if unsuccessful
    info: str                                       # Error or status message
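As an illustration of how these structures are consumed, the sketch below unpacks a quality report. It assumes the quality_output queue created earlier and that a CalibrationQuality command has already been sent.
Python
# Read one CalibrationQuality message and print its statistics
quality = quality_output.get()
print(f"Status: {quality.info}")

if quality.qualityData:
    q = quality.qualityData
    print(f"Rotation change [rx, ry, rz] in deg:   {q.rotationChange}")
    print(f"Mean Sampson error (current) [px]:     {q.sampsonErrorCurrent:.3f}")
    print(f"Mean Sampson error (achievable) [px]:  {q.sampsonErrorNew:.3f}")
    print(f"Depth error change at 1/2/5/10 m [%]:  {q.depthErrorDifference}")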

Reading Coverage Data

Coverage data is sent via coverageOutput when an image is loaded, either manually or during background calibration.
Manual Image Load
Python
# Load a single image
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.LoadImage()
    )
)

# Get coverage after loading
coverage = coverage_output.get()
print(f"Coverage = {coverage.meanCoverage}")
Continuous Collection During Calibration
Python
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.StartCalibration()
    )
)

while pipeline.isRunning():
    # Blocking read
    coverage = coverage_output.get()
    print(f"Coverage = {coverage.meanCoverage}")

    # Non-blocking read
    coverage = coverage_output.tryGet()
    if coverage:
        print(f"Coverage = {coverage.meanCoverage}")

Reading Calibration Data

Calibration results can be obtained from:
  • dai.DynamicCalibrationControl.Commands.StartCalibration() — starts collecting data and attempts calibration.
  • dai.DynamicCalibrationControl.Commands.Calibrate(force=False) — calibrates with existing loaded data.
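A minimal sketch of the second path, reusing the command_input and calibration_output queues created earlier: request a calibration from the already loaded data, then read the result and apply it.
Python
# Compute a calibration from the data loaded so far
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.Calibrate()
    )
)

# Read the result (DynamicCalibrationResult)
result = calibration_output.get()
print(f"Status: {result.info}")

if result.calibrationData:
    # Apply the newly computed calibration to the device
    command_input.send(
        dai.DynamicCalibrationControl(
            dai.DynamicCalibrationControl.Commands.ApplyCalibration(
                result.calibrationData.newCalibration
            )
        )
    )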

Performance Modes

The performance mode sets how much data must be collected, and how strict the internal checks are, before a calibration is computed.
Python
dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_PERFORMANCE  # Strictest mode, highest accuracy
dai.node.DynamicCalibration.PerformanceMode.DEFAULT               # Less strict, sufficient for most cases
dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_SPEED        # Prioritizes speed over accuracy
dai.node.DynamicCalibration.PerformanceMode.STATIC_SCENERY        # Least strict, for static scenes
dai.node.DynamicCalibration.PerformanceMode.SKIP_CHECKS           # Skips all internal checks
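The mode is selected with the SetPerformanceMode command introduced above. A minimal sketch, reusing the command_input queue:
Python
# Select a performance mode before starting or requesting a calibration
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.SetPerformanceMode(
            dai.node.DynamicCalibration.PerformanceMode.DEFAULT
        )
    )
)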

Examples

Dynamic Calibration Interactive Visualizer

The following interactive example lets you start or force calibration, load images, check calibration quality, apply or undo changes, and reset or update the system. The interface shows progress and results with color bars, summaries of changes, an optional depth view, and a help panel that can be toggled on or off.
Command Line
git clone --depth 1 --branch main https://github.com/luxonis/oak-examples.git
cd oak-examples/dynamic-calibration/
pip install -r requirements.txt
python3 main.py

Calibration Quality check

Run this example by following the README on GitHub.
Python
import depthai as dai
import numpy as np
import time
import cv2

# ---------- Pipeline definition ----------
with dai.Pipeline() as pipeline:
    # Create camera nodes
    monoLeft  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
    monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

    # Request full resolution NV12 outputs
    monoLeftOut  = monoLeft.requestFullResolutionOutput(dai.ImgFrame.Type.NV12)
    monoRightOut = monoRight.requestFullResolutionOutput(dai.ImgFrame.Type.NV12)

    # Initialize the DynamicCalibration node
    dynCalib = pipeline.create(dai.node.DynamicCalibration)

    # Link the cameras to the DynamicCalibration
    monoLeftOut.link(dynCalib.left)
    monoRightOut.link(dynCalib.right)

    stereo = pipeline.create(dai.node.StereoDepth)
    monoLeftOut.link(stereo.left)
    monoRightOut.link(stereo.right)

    # Queues
    syncedLeftQueue  = stereo.syncedLeft.createOutputQueue()
    syncedRightQueue = stereo.syncedRight.createOutputQueue()
    disparityQueue = stereo.disparity.createOutputQueue()

    # Initialize the command output queues for coverage and calibration quality
    dynCalibCoverageQueue = dynCalib.coverageOutput.createOutputQueue()
    dynCalibQualityQueue = dynCalib.qualityOutput.createOutputQueue()

    # Initialize the command input queue
    dynCalibInputControl = dynCalib.inputControl.createInputQueue()

    device = pipeline.getDefaultDevice()
    device.setCalibration(device.readCalibration())

    # Setup the colormap for visualization
    colorMap = cv2.applyColorMap(np.arange(256, dtype=np.uint8), cv2.COLORMAP_JET)
    colorMap[0] = [0, 0, 0]  # to make zero-disparity pixels black
    maxDisparity = 1

    pipeline.start()
    time.sleep(1)  # wait for auto exposure to settle

    while pipeline.isRunning():
        leftSynced  = syncedLeftQueue.get()
        rightSynced = syncedRightQueue.get()
        disparity = disparityQueue.get()

        cv2.imshow("left", leftSynced.getCvFrame())
        cv2.imshow("right", rightSynced.getCvFrame())

        # --- Disparity visualization ---
        npDisparity = disparity.getFrame()
        curMax = float(np.max(npDisparity))
        if curMax > 0:
            maxDisparity = max(maxDisparity, curMax)
        normalized = (npDisparity / (maxDisparity if maxDisparity > 0 else 1.0) * 255.0).astype(np.uint8)
        colorizedDisparity = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)
        colorizedDisparity[normalized == 0] = (0, 0, 0)
        cv2.imshow("disparity", colorizedDisparity)

        # --- Load one frame into calibration & read coverage
        dynCalibInputControl.send(dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.LoadImage()))
        coverage = dynCalibCoverageQueue.get()
        if coverage is not None:
            print(f"2D Spatial Coverage = {coverage.meanCoverage} / 100 [%]")
            print(f"Data Acquired       = {coverage.dataAcquired} / 100 [%]")

        # --- Request a quality evaluation & read result
        dynCalibInputControl.send(dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.CalibrationQuality(False)))
        dynQualityResult = dynCalibQualityQueue.get()
        if dynQualityResult is not None:
            print(f"Dynamic calibration status: {dynQualityResult.info}")

            # If the quality evaluation succeeded, print the statistics
            if dynQualityResult.qualityData:
                q = dynQualityResult.qualityData
                print("Successfully evaluated Quality")
                rotDiff = float(np.sqrt(q.rotationChange[0]**2 +
                                        q.rotationChange[1]**2 +
                                        q.rotationChange[2]**2))
                print(f"Rotation difference: || r_current - r_new || = {rotDiff:.2f} deg")
                print(f"Mean Sampson error achievable = {q.sampsonErrorNew:.3f} px")
                print(f"Mean Sampson error current    = {q.sampsonErrorCurrent:.3f} px")
                print(
                    "Theoretical Depth Error Difference "
                    f"@1m:{q.depthErrorDifference[0]:.2f}%, "
                    f"2m:{q.depthErrorDifference[1]:.2f}%, "
                    f"5m:{q.depthErrorDifference[2]:.2f}%, "
                    f"10m:{q.depthErrorDifference[3]:.2f}%"
                )
                # Reset temporary accumulators before the next cycle
                dynCalibInputControl.send(dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.ResetData()))

        key = cv2.waitKey(1)
        if key == ord('q'):
            pipeline.stop()
            break

Dynamic Calibration

Run this example by following the README on GitHub.
Python
import depthai as dai
import numpy as np
import time
import cv2

# ---------- Pipeline definition ----------
with dai.Pipeline() as pipeline:
    # Cameras
    monoLeft  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
    monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

    # Full-res NV12 outputs
    monoLeftOut  = monoLeft.requestFullResolutionOutput(dai.ImgFrame.Type.NV12)
    monoRightOut = monoRight.requestFullResolutionOutput(dai.ImgFrame.Type.NV12)

    # Initialize the DynamicCalibration node
    dynCalib = pipeline.create(dai.node.DynamicCalibration)

    # Link the cameras to the DynamicCalibration
    monoLeftOut.link(dynCalib.left)
    monoRightOut.link(dynCalib.right)

    # Stereo (for disparity + synced previews)
    stereo = pipeline.create(dai.node.StereoDepth)
    monoLeftOut.link(stereo.left)
    monoRightOut.link(stereo.right)

    # Output queues
    syncedLeftQueue  = stereo.syncedLeft.createOutputQueue()
    syncedRightQueue = stereo.syncedRight.createOutputQueue()
    disparityQueue   = stereo.disparity.createOutputQueue()

    # Initialize the command output queues for calibration and coverage
    dynCalibCalibrationQueue = dynCalib.calibrationOutput.createOutputQueue()
    dynCalibCoverageQueue    = dynCalib.coverageOutput.createOutputQueue()

    # Initialize the command input queue
    dynCalibInputControl = dynCalib.inputControl.createInputQueue()

    device = pipeline.getDefaultDevice()
    device.setCalibration(device.readCalibration())

    # Setup the colormap for visualization
    colorMap = cv2.applyColorMap(np.arange(256, dtype=np.uint8), cv2.COLORMAP_JET)
    colorMap[0] = [0, 0, 0]  # to make zero-disparity pixels black
    maxDisparity = 1.0

    pipeline.start()
    time.sleep(1)  # wait for auto exposure to settle

    # Set performance mode
    dynCalibInputControl.send(
        dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.SetPerformanceMode(
            dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_PERFORMANCE)
        )
    )

    # Start periodic calibration
    dynCalibInputControl.send(
        dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.StartCalibration())
    )

    while pipeline.isRunning():
        leftSynced  = syncedLeftQueue.get()
        rightSynced = syncedRightQueue.get()
        disparity = disparityQueue.get()

        cv2.imshow("left", leftSynced.getCvFrame())
        cv2.imshow("right", rightSynced.getCvFrame())

        # --- Disparity visualization ---
        npDisparity = disparity.getFrame()
        curMax = float(np.max(npDisparity))
        if curMax > 0:
            maxDisparity = max(maxDisparity, curMax)

        # Normalize to [0,255] and colorize; keep zero-disparity as black
        denom = maxDisparity if maxDisparity > 0 else 1.0
        normalized = (npDisparity / denom * 255.0).astype(np.uint8)
        colorizedDisparity = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)
        colorizedDisparity[normalized == 0] = (0, 0, 0)
        cv2.imshow("disparity", colorizedDisparity)

        # --- Coverage (non-blocking) ---
        coverage = dynCalibCoverageQueue.tryGet()
        if coverage is not None:
            print(f"2D Spatial Coverage = {coverage.meanCoverage} / 100 [%]")
            print(f"Data Acquired       = {coverage.dataAcquired} / 100 [%]")

        # --- Calibration result (non-blocking) ---
        dynCalibrationResult = dynCalibCalibrationQueue.tryGet()
        calibrationData = dynCalibrationResult.calibrationData if dynCalibrationResult is not None else None

        if dynCalibrationResult is not None:
            print(f"Dynamic calibration status: {dynCalibrationResult.info}")

        # --- Apply calibration if available, print quality deltas, then reset+continue ---
        if calibrationData:
            print("Successfully calibrated")
            # Apply to device
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl(
                    dai.DynamicCalibrationControl.Commands.ApplyCalibration(calibrationData.newCalibration)
                )
            )

            q = calibrationData.calibrationDifference
            rotDiff = float(np.sqrt(q.rotationChange[0]**2 +
                                    q.rotationChange[1]**2 +
                                    q.rotationChange[2]**2))
            print(f"Rotation difference: || r_current - r_new || = {rotDiff:.2f} deg")
            print(f"Mean Sampson error achievable = {q.sampsonErrorNew:.3f} px")
            print(f"Mean Sampson error current    = {q.sampsonErrorCurrent:.3f} px")
            print("Theoretical Depth Error Difference "
                  f"@1m:{q.depthErrorDifference[0]:.2f}%, "
                  f"2m:{q.depthErrorDifference[1]:.2f}%, "
                  f"5m:{q.depthErrorDifference[2]:.2f}%, "
                  f"10m:{q.depthErrorDifference[3]:.2f}%")

            # Reset accumulators and continue periodic calibration
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.ResetData())
            )
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl(dai.DynamicCalibrationControl.Commands.StartCalibration())
            )

        key = cv2.waitKey(1)
        if key == ord('q'):
            pipeline.stop()
            break
For more information please follow the example's README.

Performance & scenery guidelines

Capture a diverse scene

  • Include textured objects at multiple depths.
  • Avoid blank walls or featureless surfaces.
  • Slowly move the camera to cover the full FOV; avoid sudden motions.

Suitable Scenes for Calibration

Good calibration scenes make it easier for algorithms to detect, match, and track features. Recommended characteristics:
  • Rich texture: Surfaces with varied colors/details (e.g., brick, bark, bookshelves).
  • Matte, opaque materials: Non-reflective surfaces like wood, paper, fabric, stone.
  • Unique patterns: Irregular, distinctive objects—avoid repetitive structures.
  • Even lighting: Diffuse daylight or soft ambient light to reduce shadows/glare.
  • Structured details: Edges, corners, and small landmarks spread across depths.

PerformanceMode tuning

Mode                 | When to use
DEFAULT              | Balanced accuracy vs. speed.
STATIC_SCENERY       | Camera is fixed and the scene is stable.
OPTIMIZE_SPEED       | Fastest calibration, reduced precision.
OPTIMIZE_PERFORMANCE | Maximum precision in feature-rich scenes.
SKIP_CHECKS          | Automated pipelines where the internal checks that guarantee scene quality are skipped.

Best Practices

For high-accuracy calibration:
  • Use OPTIMIZE_PERFORMANCE with a dynamic, well-featured scene.
  • Use STATIC_SCENERY if the camera is fixed and viewing a stable, structured environment.
  • Use SKIP_CHECKS only in automated workflows where scenery quality is externally validated.
By combining appropriate scenery with the correct PerformanceMode, users can significantly improve calibration reliability and depth estimation quality.
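For instance, a fixed camera overlooking a stable scene could be configured as in the following sketch, reusing the command_input queue from the usage section above:
Python
# Fixed camera, stable scene: select STATIC_SCENERY, then calibrate in the background
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.SetPerformanceMode(
            dai.node.DynamicCalibration.PerformanceMode.STATIC_SCENERY
        )
    )
)
command_input.send(
    dai.DynamicCalibrationControl(
        dai.DynamicCalibrationControl.Commands.StartCalibration()
    )
)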

Limitations & notes

  • Supported devices — Dynamic Calibration is available for:
    • All stereo OAK Series 2 cameras (excluding FFC)
    • All stereo OAK Series 4 cameras
  • DepthAI version — requires DepthAI 3.0 or later.
  • Re-calibrated parameters — updates extrinsics only; intrinsics remain unchanged.
  • OS support — Available on Linux, macOS, and Windows.
  • Absolute depth spec — DCL improves relative depth perception; absolute accuracy may still differ slightly from the original factory spec.

Troubleshooting

Symptom                                      | Possible cause                               | Fix
High reprojection error                      | Incorrect model name or HFOV in board config | Verify board JSON and camera specs
Depth still incorrect after “successful” DCL | Left / right cameras swapped                 | Swap sockets or update board config and recalibrate
nullopt quality report                       | Insufficient scene coverage                  | Move camera to capture richer textures
Runtime error: "The calibration on the device is too old to perform DynamicCalibration, full re-calibration required!" | The device calibration is too outdated for dynamic recalibration to provide any benefit. | Perform a full re-calibration (see the manual calibration guide)

See also

Manual stereo and ToF calibration guide