ON THIS PAGE

  • Key capabilities
  • Automatic calibration options
  • Usage of the Dynamic Calibration Library (DCL)
  • Examples
  • Scenery guidelines
  • PerformanceMode tuning
  • Limitations & notes
  • Troubleshooting
  • See also

DynamicCalibration - technical implementation

DynamicCalibration is a self-calibration workflow built into DepthAI 3.0 that restores and maintains stereo accuracy when temperature changes, physical shocks, or long-term drift degrade factory calibration.
On this page we lay out the steps for integrating DynamicCalibration into your projects. For general information about DynamicCalibration, visit this page.

Key capabilities

  • Restore depth performance - brings the disparity map back to optimal visual quality.
  • No targets required - operates in natural scenes; just move the camera to capture a varied view.
  • Rapid execution - typically completes in seconds.
  • Health monitoring - run diagnostics at any time without flashing a new calibration.

Automatic calibration options

There are two ways to run automatic calibration on top of DynamicCalibration:
  • Use the AutoCalibration Host Node for explicit node-level control inside your pipeline.
  • Use DEPTHAI_AUTOCALIBRATION for deployment-time enablement without changing pipeline code.
The detailed behavior and deployment guidance for both options is documented on the AutoCalibration page.

Usage of the Dynamic Calibration Library (DCL)

This section gives a high-level overview of how to use the DynamicCalibration node in DepthAI for dynamic calibration workflows.
Dynamic calibration takes as input a DynamicCalibrationControl message carrying a command (e.g. ApplyCalibration, Calibrate; see all available commands in the message definition). Depending on which command was sent, the node returns its result on one of its output queues. The next paragraphs explore how to actually integrate DCL into your code.
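As a rough mental model, each command is answered on a specific output queue. The pairing below is an illustrative summary distilled from the descriptions on this page (plain Python, not the depthai API itself), and is not an exhaustive contract — StartCalibration, for example, also streams coverage updates on coverageOutput:

```python
# Illustrative command -> output-queue summary, based on this page's descriptions.
COMMAND_TO_QUEUE = {
    "Calibrate": "calibrationOutput",              # new calibration result
    "StartCalibration": "calibrationOutput",       # continuous calibration results
    "CalibrationQuality": "qualityOutput",         # quality evaluation
    "LoadImage": "coverageOutput",                 # coverage of the loaded image
    "ComputeCalibrationMetrics": "metricsOutput",  # data quality / confidence
}

def queue_for(command: str) -> str:
    """Return the name of the output queue expected to carry the reply."""
    return COMMAND_TO_QUEUE[command]
```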

Initializing the DynamicCalibration Node

The DynamicCalibration node requires two synchronized camera streams from the same device. Here's how to set it up:
Python
import depthai as dai

# Initialize the pipeline
pipeline = dai.Pipeline()

# Create camera nodes
cam_left = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
cam_right = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

# Request full resolution NV12 outputs
left_out = cam_left.requestFullResolutionOutput()
right_out = cam_right.requestFullResolutionOutput()

# Initialize the DynamicCalibration node
dyn_calib = pipeline.create(dai.node.DynamicCalibration)

# Link the cameras to the DynamicCalibration
left_out.link(dyn_calib.left)
right_out.link(dyn_calib.right)

device = pipeline.getDefaultDevice()
calibration = device.readCalibration()
device.setCalibration(calibration)

pipeline.start()
while pipeline.isRunning():
    ...

Sending Commands to the Node

Nodes in DepthAI communicate via input/output message queues. The DynamicCalibration node has several queues, but the most important for control is the inputControl queue.
Python
# Initialize the command input queue
command_input = dyn_calib.inputControl.createInputQueue()
# Example: send a command to start calibration
command_input.send(dai.DynamicCalibrationControl.startCalibration())
Available Commands
  • StartCalibration() - Starts the calibration process.
  • StopCalibration() - Stops the calibration process.
  • Calibrate(force=False) - Computes a new calibration from the loaded data.
    • force - when True, ignores restrictions on the loaded data
  • CalibrationQuality(force=False) - Evaluates the quality of the current calibration.
    • force - when True, ignores restrictions on the loaded data
  • LoadImage() - Loads one image from the device.
  • ComputeCalibrationMetrics(calibration) - Computes calibration metrics such as dataQuality and calibrationConfidence.
  • ApplyCalibration(calibration) - Applies the given calibration to the device.
  • SetPerformanceMode(performanceMode) - Sets the performance mode to be used.
  • ResetData() - Removes all previously loaded data.

Receiving Data from the Node

The node provides multiple output queues:
Python
# queue for receiving new calibration
calibration_output = dyn_calib.calibrationOutput.createOutputQueue()
# queue for receiving the coverage
coverage_output = dyn_calib.coverageOutput.createOutputQueue()
# queue for checking the calibration quality
quality_output = dyn_calib.qualityOutput.createOutputQueue()
# queue for checking the calibration metrics
metrics_output = dyn_calib.metricsOutput.createOutputQueue()
Check the references of DynamicCalibrationResult, CoverageData, CalibrationQuality, and CalibrationMetrics for the exact data structures of the outputs.
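To make the quality fields concrete before diving into full examples, here is a stand-alone sketch. The field names (rotationChange, sampsonErrorNew, sampsonErrorCurrent, depthErrorDifference) are taken from the examples below; the dataclass itself is a hypothetical stand-in, not a depthai type:

```python
import math
from dataclasses import dataclass

@dataclass
class QualityData:
    """Hypothetical stand-in mirroring the CalibrationQuality fields used below."""
    rotationChange: tuple        # (rx, ry, rz) change in degrees
    sampsonErrorNew: float       # achievable mean Sampson error, px
    sampsonErrorCurrent: float   # current mean Sampson error, px
    depthErrorDifference: tuple  # % at 1 m, 2 m, 5 m, 10 m

def rotation_difference_deg(q: QualityData) -> float:
    """Magnitude of the rotation change, || r_current - r_new ||, in degrees."""
    return math.sqrt(sum(c * c for c in q.rotationChange))

q = QualityData((0.3, 0.0, 0.4), 0.12, 0.35, (0.5, 1.1, 2.9, 6.0))
print(f"{rotation_difference_deg(q):.2f} deg")  # → 0.50 deg
```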

Reading Coverage Data

Coverage data is received via coverageOutput when an image is loaded either manually (with the LoadImage command) or during continuous calibration (after running the StartCalibration command).

Manual Image Load
Python
# Load a single image
command_input.send(dai.DynamicCalibrationControl.loadImage())

# Get coverage after loading
coverage = coverage_output.get()
print(f"Coverage = {coverage.meanCoverage}")
Continuous Collection During Calibration
Python
command_input.send(dai.DynamicCalibrationControl.startCalibration())

while pipeline.isRunning():
    # Blocking read
    coverage = coverage_output.get()
    print(f"Coverage = {coverage.meanCoverage}")

    # Non-blocking read
    coverage = coverage_output.tryGet()
    if coverage:
        print(f"Coverage = {coverage.meanCoverage}")
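When driving calibration manually, a common pattern is to gate the Calibrate command on the reported coverage. A minimal sketch, assuming meanCoverage and dataAcquired are percentages in [0, 100] as printed above; the thresholds are arbitrary illustrations to tune for your scene:

```python
def ready_to_calibrate(mean_coverage: float, data_acquired: float,
                       min_coverage: float = 60.0, min_data: float = 100.0) -> bool:
    """Decide whether enough varied data has been collected.

    Thresholds are illustrative assumptions, not DCL requirements.
    """
    return mean_coverage >= min_coverage and data_acquired >= min_data

# e.g. after `coverage = coverage_output.get()`:
# if ready_to_calibrate(coverage.meanCoverage, coverage.dataAcquired):
#     command_input.send(dai.DynamicCalibrationControl.calibrate())
```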

Reading Calibration Data

Calibration results can be obtained from:
  • dai.DynamicCalibrationControl.startCalibration() - starts collecting data and attempts calibration.
  • dai.DynamicCalibrationControl.calibrate(force=False) - calibrates with the already-loaded data (an image must be loaded beforehand with the LoadImage command, as shown in the example below).
Calibration data is returned as a DynamicCalibrationResult message.

Manual Image Load
Python
# Load a single image
command_input.send(dai.DynamicCalibrationControl.loadImage())

# Send a command to calibrate
command_input.send(dai.DynamicCalibrationControl.calibrate(force=False))

# Get calibration after loading
calibration = calibration_output.get()
print(f"Calibration = {calibration.info}")
Continuous Collection
Python
# Starts collecting data and attempts calibration
command_input.send(dai.DynamicCalibrationControl.startCalibration())

while pipeline.isRunning():
    # Blocking read
    calibration = calibration_output.get()
    print(f"Calibration = {calibration.info}")

    # Non-blocking read
    calibration = calibration_output.tryGet()
    if calibration:
        print(f"Calibration = {calibration.info}")

Performance Modes

Set the performance mode with:
Python
# Set performance mode
command_input.send(dai.DynamicCalibrationControl.setPerformanceMode(dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_PERFORMANCE))
The performance mode sets the amount of data needed for the calibration.
Python
dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_PERFORMANCE  # The most strict mode
dai.node.DynamicCalibration.PerformanceMode.DEFAULT               # Less strict but mostly sufficient
dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_SPEED        # Optimize speed over performance
dai.node.DynamicCalibration.PerformanceMode.STATIC_SCENERY        # Not strict
dai.node.DynamicCalibration.PerformanceMode.SKIP_CHECKS           # Skip all internal checks

Examples

Dynamic Calibration Interactive Visualizer

With the following commands you can clone the repository and run the calibration integration example.
Command Line
git clone https://github.com/luxonis/depthai-core.git
cd depthai-core/
python3 -m venv venv
source venv/bin/activate
python3 examples/python/install_requirements.py
python3 examples/python/DynamicCalibration/calibration_integration.py

Calibration Quality Check

Python

Run this example by following the README on GitHub.
import depthai as dai
import numpy as np
import time
import cv2

# ---------- Pipeline definition ----------
with dai.Pipeline() as pipeline:
    # Create camera nodes
    monoLeft  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
    monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

    # Request full resolution NV12 outputs
    monoLeftOut  = monoLeft.requestFullResolutionOutput()
    monoRightOut = monoRight.requestFullResolutionOutput()

    # Initialize the DynamicCalibration node
    dynCalib = pipeline.create(dai.node.DynamicCalibration)

    # Link the cameras to the DynamicCalibration
    monoLeftOut.link(dynCalib.left)
    monoRightOut.link(dynCalib.right)

    stereo = pipeline.create(dai.node.StereoDepth)
    monoLeftOut.link(stereo.left)
    monoRightOut.link(stereo.right)

    # Queues
    syncedLeftQueue  = stereo.syncedLeft.createOutputQueue()
    syncedRightQueue = stereo.syncedRight.createOutputQueue()
    disparityQueue = stereo.disparity.createOutputQueue()

    # Initialize the output queues for coverage and calibration quality
    dynCalibCoverageQueue = dynCalib.coverageOutput.createOutputQueue()
    dynCalibQualityQueue = dynCalib.qualityOutput.createOutputQueue()

    # Initialize the command input queue
    dynCalibInputControl = dynCalib.inputControl.createInputQueue()

    device = pipeline.getDefaultDevice()
    device.setCalibration(device.readCalibration())

    # Setup the colormap for visualization
    colorMap = cv2.applyColorMap(np.arange(256, dtype=np.uint8), cv2.COLORMAP_JET)
    colorMap[0] = [0, 0, 0]  # to make zero-disparity pixels black
    maxDisparity = 1

    pipeline.start()
    time.sleep(1)  # wait for auto exposure to settle

    while pipeline.isRunning():
        leftSynced  = syncedLeftQueue.get()
        rightSynced = syncedRightQueue.get()
        disparity = disparityQueue.get()

        cv2.imshow("left", leftSynced.getCvFrame())
        cv2.imshow("right", rightSynced.getCvFrame())

        # --- Disparity visualization ---
        npDisparity = disparity.getFrame()
        curMax = float(np.max(npDisparity))
        if curMax > 0:
            maxDisparity = max(maxDisparity, curMax)
        normalized = (npDisparity / (maxDisparity if maxDisparity > 0 else 1.0) * 255.0).astype(np.uint8)
        colorizedDisparity = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)
        colorizedDisparity[normalized == 0] = (0, 0, 0)
        cv2.imshow("disparity", colorizedDisparity)

        # --- Load one frame into calibration & read coverage
        dynCalibInputControl.send(dai.DynamicCalibrationControl.loadImage())
        coverage = dynCalibCoverageQueue.get()
        if coverage is not None:
            print(f"2D Spatial Coverage = {coverage.meanCoverage} / 100 [%]")
            print(f"Data Acquired       = {coverage.dataAcquired} / 100 [%]")

        # --- Request a quality evaluation & read result
        dynCalibInputControl.send(dai.DynamicCalibrationControl.calibrationQuality(False))
        dynQualityResult = dynCalibQualityQueue.get()
        if dynQualityResult is not None:
            print(f"Dynamic calibration status: {dynQualityResult.info}")

            # If quality data was successfully returned, report it
            if dynQualityResult.qualityData:
                q = dynQualityResult.qualityData
                print("Successfully evaluated Quality")
                rotDiff = float(np.sqrt(q.rotationChange[0]**2 +
                                        q.rotationChange[1]**2 +
                                        q.rotationChange[2]**2))
                print(f"Rotation difference: || r_current - r_new || = {rotDiff:.2f} deg")
                print(f"Mean Sampson error achievable = {q.sampsonErrorNew:.3f} px")
                print(f"Mean Sampson error current    = {q.sampsonErrorCurrent:.3f} px")
                print(
                    "Theoretical Depth Error Difference "
                    f"@1m:{q.depthErrorDifference[0]:.2f}%, "
                    f"2m:{q.depthErrorDifference[1]:.2f}%, "
                    f"5m:{q.depthErrorDifference[2]:.2f}%, "
                    f"10m:{q.depthErrorDifference[3]:.2f}%"
                )
                # Reset temporary accumulators before the next cycle
                dynCalibInputControl.send(dai.DynamicCalibrationControl.resetData())

        key = cv2.waitKey(1)
        if key == ord('q'):
            pipeline.stop()
            break

C++

Example from our GitHub:
#include <chrono>
#include <cmath>
#include <iomanip>
#include <iostream>
#include <opencv2/opencv.hpp>
#include <thread>

#include "depthai/depthai.hpp"

int main() {
    auto device = std::make_shared<dai::Device>();

    // ---------- Pipeline definition ----------
    dai::Pipeline pipeline(device);

    auto monoLeft = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_B);
    auto monoRight = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_C);

    auto* leftOut = monoLeft->requestFullResolutionOutput();
    auto* rightOut = monoRight->requestFullResolutionOutput();

    // Dynamic-calibration node
    auto dynCalib = pipeline.create<dai::node::DynamicCalibration>();
    leftOut->link(dynCalib->left);
    rightOut->link(dynCalib->right);

    auto stereo = pipeline.create<dai::node::StereoDepth>();
    leftOut->link(stereo->left);
    rightOut->link(stereo->right);

    // In-pipeline host queues
    auto leftSyncedQueue = stereo->syncedLeft.createOutputQueue();
    auto rightSyncedQueue = stereo->syncedRight.createOutputQueue();
    auto disparityQueue = stereo->disparity.createOutputQueue();

    auto dynQualityOutQ = dynCalib->qualityOutput.createOutputQueue();
    auto dynCoverageOutQ = dynCalib->coverageOutput.createOutputQueue();
    auto dynCalibInputControl = dynCalib->inputControl.createInputQueue();

    device->setCalibration(device->readCalibration());

    pipeline.start();
    std::this_thread::sleep_for(std::chrono::seconds(1));  // wait for autoexposure to settle

    using DCC = dai::DynamicCalibrationControl;

    while(pipeline.isRunning()) {
        auto leftSynced = leftSyncedQueue->get<dai::ImgFrame>();
        auto rightSynced = rightSyncedQueue->get<dai::ImgFrame>();
        auto disparity = disparityQueue->get<dai::ImgFrame>();

        cv::imshow("left", leftSynced->getCvFrame());
        cv::imshow("right", rightSynced->getCvFrame());

        // --- Load one frame pair into the calibration pipeline
        dynCalibInputControl->send(DCC::loadImage());

        // Wait for coverage info
        auto coverageMsg = dynCoverageOutQ->get<dai::CoverageData>();
        if(coverageMsg) {
            std::cout << "2D Spatial Coverage = " << coverageMsg->meanCoverage << " / 100 [%]" << std::endl;
            std::cout << "Data Acquired       = " << coverageMsg->dataAcquired << " / 100 [%]" << std::endl;
        }

        // Request a calibration quality evaluation (non-forced)
        dynCalibInputControl->send(DCC::calibrationQuality(false));

        // Wait for calibration result
        auto dynCalibrationResult = dynQualityOutQ->get<dai::CalibrationQuality>();
        if(dynCalibrationResult) {
            std::cout << "Dynamic calibration status: " << dynCalibrationResult->info << std::endl;

            if(dynCalibrationResult->qualityData) {
                std::cout << "Successfully evaluated Quality." << std::endl;

                const auto& q = *dynCalibrationResult->qualityData;

                // --- Rotation difference magnitude (degrees) ---
                float rotDiff = std::sqrt(q.rotationChange[0] * q.rotationChange[0] + q.rotationChange[1] * q.rotationChange[1]
                                          + q.rotationChange[2] * q.rotationChange[2]);
                std::cout << "Rotation difference: || r_current - r_new || = " << rotDiff << " deg" << std::endl;

                // --- Sampson error (px) ---
                std::cout << "Mean Sampson error achievable = " << q.sampsonErrorNew << " px" << std::endl;
                std::cout << "Mean Sampson error current    = " << q.sampsonErrorCurrent << " px" << std::endl;

                // --- Depth error difference (%) at 1/2/5/10 m ---
                std::cout << "Theoretical Depth Error Difference " << "@1m:" << std::fixed << std::setprecision(2) << q.depthErrorDifference[0] << "%, "
                          << "2m:" << q.depthErrorDifference[1] << "%, " << "5m:" << q.depthErrorDifference[2] << "%, " << "10m:" << q.depthErrorDifference[3]
                          << "%" << std::endl;

                // (Optional) Trigger a calibration step if desired:
                // dynCalibInputControl->send(DCC::calibrate(true));

                // Reset temporary data after reading metrics
                dynCalibInputControl->send(DCC::resetData());
            }
        } else {
            std::cout << "Dynamic calibration: no result received." << std::endl;
        }

        int key = cv::waitKey(1);
        if(key == 'q') break;
    }

    return 0;
}

Dynamic Calibration

Python

Run this example by following the README on GitHub.
import depthai as dai
import numpy as np
import time
import cv2

# ---------- Pipeline definition ----------
with dai.Pipeline() as pipeline:
    # Cameras
    monoLeft  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
    monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

    # Full-res NV12 outputs
    monoLeftOut  = monoLeft.requestFullResolutionOutput()
    monoRightOut = monoRight.requestFullResolutionOutput()

    # Initialize the DynamicCalibration node
    dynCalib = pipeline.create(dai.node.DynamicCalibration)

    # Link the cameras to the DynamicCalibration
    monoLeftOut.link(dynCalib.left)
    monoRightOut.link(dynCalib.right)

    # Stereo (for disparity + synced previews)
    stereo = pipeline.create(dai.node.StereoDepth)
    monoLeftOut.link(stereo.left)
    monoRightOut.link(stereo.right)

    # Output queues
    syncedLeftQueue  = stereo.syncedLeft.createOutputQueue()
    syncedRightQueue = stereo.syncedRight.createOutputQueue()
    disparityQueue   = stereo.disparity.createOutputQueue()

    # Initialize the output queues for calibration and coverage
    dynCalibCalibrationQueue = dynCalib.calibrationOutput.createOutputQueue()
    dynCalibCoverageQueue    = dynCalib.coverageOutput.createOutputQueue()

    # Initialize the command input queue
    dynCalibInputControl = dynCalib.inputControl.createInputQueue()

    device = pipeline.getDefaultDevice()
    device.setCalibration(device.readCalibration())

    # Setup the colormap for visualization
    colorMap = cv2.applyColorMap(np.arange(256, dtype=np.uint8), cv2.COLORMAP_JET)
    colorMap[0] = [0, 0, 0]  # to make zero-disparity pixels black
    maxDisparity = 1.0

    pipeline.start()
    time.sleep(1)  # wait for auto exposure to settle

    # Set performance mode
    dynCalibInputControl.send(
        dai.DynamicCalibrationControl.setPerformanceMode(
            dai.DynamicCalibrationControl.PerformanceMode.OPTIMIZE_PERFORMANCE
        )
    )

    # Start periodic calibration
    dynCalibInputControl.send(
        dai.DynamicCalibrationControl.startCalibration()
    )

    while pipeline.isRunning():
        leftSynced  = syncedLeftQueue.get()
        rightSynced = syncedRightQueue.get()
        disparity = disparityQueue.get()

        cv2.imshow("left", leftSynced.getCvFrame())
        cv2.imshow("right", rightSynced.getCvFrame())

        # --- Disparity visualization ---
        npDisparity = disparity.getFrame()
        curMax = float(np.max(npDisparity))
        if curMax > 0:
            maxDisparity = max(maxDisparity, curMax)

        # Normalize to [0,255] and colorize; keep zero-disparity as black
        denom = maxDisparity if maxDisparity > 0 else 1.0
        normalized = (npDisparity / denom * 255.0).astype(np.uint8)
        colorizedDisparity = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)
        colorizedDisparity[normalized == 0] = (0, 0, 0)
        cv2.imshow("disparity", colorizedDisparity)

        # --- Coverage (non-blocking) ---
        coverage = dynCalibCoverageQueue.tryGet()
        if coverage is not None:
            print(f"2D Spatial Coverage = {coverage.meanCoverage} / 100 [%]")
            print(f"Data Acquired       = {coverage.dataAcquired} / 100 [%]")

        # --- Calibration result (non-blocking) ---
        dynCalibrationResult = dynCalibCalibrationQueue.tryGet()
        calibrationData = dynCalibrationResult.calibrationData if dynCalibrationResult is not None else None

        if dynCalibrationResult is not None:
            print(f"Dynamic calibration status: {dynCalibrationResult.info}")

        # --- Apply calibration if available, print quality deltas, then reset+continue ---
        if calibrationData:
            print("Successfully calibrated")
            # Apply to device
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl.applyCalibration(calibrationData.newCalibration)
            )

            q = calibrationData.calibrationDifference
            rotDiff = float(np.sqrt(q.rotationChange[0]**2 +
                                    q.rotationChange[1]**2 +
                                    q.rotationChange[2]**2))
            print(f"Rotation difference: || r_current - r_new || = {rotDiff:.2f} deg")
            print(f"Mean Sampson error achievable = {q.sampsonErrorNew:.3f} px")
            print(f"Mean Sampson error current    = {q.sampsonErrorCurrent:.3f} px")
            print("Theoretical Depth Error Difference "
                  f"@1m:{q.depthErrorDifference[0]:.2f}%, "
                  f"2m:{q.depthErrorDifference[1]:.2f}%, "
                  f"5m:{q.depthErrorDifference[2]:.2f}%, "
                  f"10m:{q.depthErrorDifference[3]:.2f}%")

            # Reset accumulators and continue periodic calibration
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl.resetData()
            )
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl.startCalibration()
            )

        key = cv2.waitKey(1)
        if key == ord('q'):
            pipeline.stop()
            break

C++

Example from our GitHub:
// examples/cpp/DynamicCalibration/calibrate.cpp
#include <algorithm>
#include <chrono>
#include <cmath>
#include <iomanip>
#include <iostream>
#include <opencv2/opencv.hpp>
#include <thread>

#include "depthai/depthai.hpp"

int main() {
    auto device = std::make_shared<dai::Device>();

    // ---------- Pipeline definition ----------
    dai::Pipeline pipeline(device);

    auto monoLeft = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_B);
    auto monoRight = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_C);

    auto* leftOut = monoLeft->requestFullResolutionOutput();
    auto* rightOut = monoRight->requestFullResolutionOutput();

    // Dynamic-calibration node
    auto dynCalib = pipeline.create<dai::node::DynamicCalibration>();
    leftOut->link(dynCalib->left);
    rightOut->link(dynCalib->right);

    auto stereo = pipeline.create<dai::node::StereoDepth>();
    leftOut->link(stereo->left);
    rightOut->link(stereo->right);

    // In-pipeline host queues
    auto leftSyncedQueue = stereo->syncedLeft.createOutputQueue();
    auto rightSyncedQueue = stereo->syncedRight.createOutputQueue();
    auto disparityQueue = stereo->disparity.createOutputQueue();

    auto dynCalibOutQ = dynCalib->calibrationOutput.createOutputQueue();
    auto dynCoverageOutQ = dynCalib->coverageOutput.createOutputQueue();

    auto dynCalibInputControl = dynCalib->inputControl.createInputQueue();

    device->setCalibration(device->readCalibration());

    pipeline.start();
    std::this_thread::sleep_for(std::chrono::seconds(1));  // wait for autoexposure to settle

    using DCC = dai::DynamicCalibrationControl;
    // Optionally set performance mode:
    dynCalibInputControl->send(DCC::setPerformanceMode(DCC::PerformanceMode::OPTIMIZE_PERFORMANCE));

    // Start calibration (optimize performance)
    dynCalibInputControl->send(DCC::startCalibration());

    double maxDisparity = 1.0;
    while(pipeline.isRunning()) {
        auto leftSynced = leftSyncedQueue->get<dai::ImgFrame>();
        auto rightSynced = rightSyncedQueue->get<dai::ImgFrame>();
        auto disparity = disparityQueue->get<dai::ImgFrame>();

        cv::imshow("left", leftSynced->getCvFrame());
        cv::imshow("right", rightSynced->getCvFrame());

        cv::Mat npDisparity = disparity->getFrame();

        double minVal = 0.0, curMax = 0.0;
        cv::minMaxLoc(npDisparity, &minVal, &curMax);
        maxDisparity = std::max(maxDisparity, curMax);

        // Normalize the disparity image to an 8-bit scale.
        cv::Mat normalized;
        npDisparity.convertTo(normalized, CV_8UC1, 255.0 / (maxDisparity > 0 ? maxDisparity : 1.0));

        cv::Mat colorizedDisparity;
        cv::applyColorMap(normalized, colorizedDisparity, cv::COLORMAP_JET);

        // Set pixels with zero disparity to black.
        colorizedDisparity.setTo(cv::Scalar(0, 0, 0), normalized == 0);

        cv::imshow("disparity", colorizedDisparity);

        // Coverage (non-blocking)
        if(auto coverageMsg = dynCoverageOutQ->tryGet<dai::CoverageData>()) {
            std::cout << "2D Spatial Coverage = " << coverageMsg->meanCoverage << "  / 100 [%]\n";
            std::cout << "Data Acquired       = " << coverageMsg->dataAcquired << "  / 100 [%]\n";
        }

        // Calibration result (non-blocking)
        if(auto dynCalibrationResult = dynCalibOutQ->tryGet<dai::DynamicCalibrationResult>()) {
            std::cout << "Dynamic calibration status: " << dynCalibrationResult->info << std::endl;

            if(dynCalibrationResult->calibrationData) {
                std::cout << "Successfully calibrated." << std::endl;

                // Apply the produced calibration
                const auto& newCalib = dynCalibrationResult->calibrationData->newCalibration;
                dynCalibInputControl->send(DCC::applyCalibration(newCalib));

                // Print quality deltas
                const auto& q = dynCalibrationResult->calibrationData->calibrationDifference;

                float rotDiff = std::sqrt(q.rotationChange[0] * q.rotationChange[0] + q.rotationChange[1] * q.rotationChange[1]
                                          + q.rotationChange[2] * q.rotationChange[2]);
                std::cout << "Rotation difference: " << rotDiff << " deg\n";
                std::cout << "Mean Sampson error achievable = " << q.sampsonErrorNew << " px\n";
                std::cout << "Mean Sampson error current    = " << q.sampsonErrorCurrent << " px\n";
                std::cout << "Theoretical Depth Error Difference " << "@1m:" << std::fixed << std::setprecision(2) << q.depthErrorDifference[0] << "%, "
                          << "2m:" << q.depthErrorDifference[1] << "%, " << "5m:" << q.depthErrorDifference[2] << "%, " << "10m:" << q.depthErrorDifference[3]
                          << "%\n";

                // Reset and start a new round if desired
                dynCalibInputControl->send(DCC::startCalibration());
            }
        }

        int key = cv::waitKey(1);
        if(key == 'q') break;
    }

    return 0;
}
For more information, please follow the example's README.

Scenery guidelines

Good calibration scenes make it easier for algorithms to detect, match, and track features. Recommended characteristics:
  • Include textured objects at multiple depths.
  • Avoid blank walls or featureless surfaces.
  • Slowly move the camera to cover the full FOV; avoid sudden motions.
Recommendations (in the web version, each is illustrated with the original image vs. its feature coverage shown in green):
  • ✅ Ensure rich texture and visual detail - rich textures, edges, and objects evenly distributed across the FOV create ideal calibration conditions.
  • 🚫 Avoid flat or featureless surfaces - a lack of textured surfaces or visually distinct objects provides few usable features.
  • 🚫 Avoid reflective and transparent surfaces - reflective and transparent surfaces create false 3D features.
  • 🚫 Avoid dark scenes - low contrast, shadows, and poorly lit scenes yield few detectable features.
  • 🚫 Avoid repetitive patterns - many patterned regions look too similar to tell apart.
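As a quick complement to these guidelines, a host application can sanity-check a frame for texture before feeding it to calibration. The gradient heuristic below is our own illustration, not part of DCL (which judges scene usability itself and reports it via coverageOutput):

```python
def texture_score(gray):
    """Mean absolute neighbour difference of a 2D grayscale image (list of rows).

    Heuristic proxy for scene texture: flat surfaces score near 0, while
    textured or high-contrast scenes score high. Illustrative only.
    """
    h, w = len(gray), len(gray[0])
    total, count = 0, 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:  # horizontal neighbour
                total += abs(gray[y][x] - gray[y][x + 1]); count += 1
            if y + 1 < h:  # vertical neighbour
                total += abs(gray[y][x] - gray[y + 1][x]); count += 1
    return total / count

flat  = [[128] * 8 for _ in range(8)]                             # blank wall
board = [[255 * ((x + y) % 2) for x in range(8)] for y in range(8)]  # checkerboard
print(texture_score(flat), texture_score(board))  # → 0.0 255.0
```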

PerformanceMode tuning

Mode | When to use
DEFAULT | Balanced accuracy vs. speed.
STATIC_SCENERY | Camera is fixed, scene stable.
OPTIMIZE_SPEED | Fastest calibration, reduced precision.
OPTIMIZE_PERFORMANCE | Maximum precision in feature-rich scenes.
SKIP_CHECKS | Automated pipelines where the internal checks that guarantee scene quality are skipped.
By combining appropriate scenery with the correct PerformanceMode, users can significantly improve calibration reliability and depth estimation quality.
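The table above can be encoded as a simple selection helper. The mode names match the PerformanceMode listing earlier on this page; the decision logic itself is an illustrative assumption, not an official recommendation:

```python
def pick_performance_mode(camera_fixed: bool, need_speed: bool,
                          feature_rich: bool) -> str:
    """Map deployment characteristics to a PerformanceMode name (illustrative)."""
    if camera_fixed:
        return "STATIC_SCENERY"       # fixed camera, stable scene
    if need_speed:
        return "OPTIMIZE_SPEED"       # fastest calibration, reduced precision
    if feature_rich:
        return "OPTIMIZE_PERFORMANCE" # maximum precision in feature-rich scenes
    return "DEFAULT"                  # balanced accuracy vs. speed
```

The returned name can then be looked up on `dai.node.DynamicCalibration.PerformanceMode` with `getattr` before sending SetPerformanceMode.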

Limitations & notes

  • Supported devices - Dynamic Calibration is available for:
    • All stereo OAK Series 2 cameras (excluding FFC)
    • All stereo OAK Series 4 cameras
  • DepthAI version - requires DepthAI 3.0 or later.
  • Re-calibrated parameters - updates extrinsics only; intrinsics remain unchanged.
  • OS support - available on Linux, macOS, and Windows.
  • Absolute depth spec - DCL improves relative depth perception; absolute accuracy may still differ slightly from the original factory spec.

Troubleshooting

Symptom | Possible cause | Fix
High reprojection error | Incorrect model name or HFOV in board config | Verify board JSON and camera specs
Depth still incorrect after "successful" DCL | Left / right cameras swapped | Swap sockets or update board config and recalibrate
nullopt quality report | Insufficient scene coverage | Move camera to capture richer textures
Runtime error: "The calibration on the device is too old to perform DynamicCalibration, full re-calibration required!" | The device calibration is too outdated for dynamic recalibration to provide any benefit | Perform a full re-calibration
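Whether the "calibration too old" failure surfaces as a raised runtime error or in a result's info string, a host-side guard can branch on the quoted message. A minimal sketch; the substring match keeps it robust to surrounding text:

```python
TOO_OLD_MARKER = "too old to perform DynamicCalibration"

def needs_full_recalibration(message: str) -> bool:
    """Detect the 'calibration too old' failure from an error/info message."""
    return TOO_OLD_MARKER in message

# e.g. wrap the calibration attempt:
# try:
#     command_input.send(dai.DynamicCalibrationControl.calibrate())
# except RuntimeError as e:
#     if needs_full_recalibration(str(e)):
#         ...  # fall back to a full re-calibration workflow
```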

See also