# DynamicCalibration - technical implementation

DynamicCalibration is a self-calibration workflow built into DepthAI 3.0 that restores and maintains stereo accuracy when
temperature changes, physical shocks, or long-term drift degrade factory calibration.

This page lays out the steps for integrating DynamicCalibration into your projects. For more general information about
DynamicCalibration, visit [this
page](https://docs.luxonis.com/hardware/platform/depth/dynamic-calibration.md).

### Key capabilities

 * Restores depth performance—brings the disparity map back to optimal visual quality.
 * No targets required—operates in natural scenes; just move the camera to capture a varied view.
 * Rapid execution—typically completes in seconds.
 * Health monitoring—run diagnostics at any time without flashing a new calibration.

## Automatic calibration options

There are three ways to run automatic calibration on top of DynamicCalibration:

 * Use the [AutoCalibration Host
   Node](https://docs.luxonis.com/software-v3/depthai/depthai-components/host_nodes/auto_calibration.md) for explicit node-level
   control inside your pipeline.
 * Use the `DEPTHAI_AUTOCALIBRATION` environment variable for deployment-time enablement without changing pipeline code.
 * Call the `Pipeline.setAutoCalibrationMode()` setter directly to enable automatic calibration without using the AutoCalibration
   Host Node (see the sketch after the note below).

> From `DepthAI 3.6`, automatic calibration is enabled by default in `ON_START` mode. Use `OFF` to disable it. If your pipeline
> already includes a `DynamicCalibration` node or an `AutoCalibration` node, `AutoCalibrationMode` is not applied.
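
If you take the setter route, the call is made on the pipeline before it is started. A minimal sketch follows; the exact spelling
and location of the mode enum (`dai.AutoCalibrationMode` below) is an assumption made for illustration, so check the
AutoCalibration documentation page for the definitive API:

```python
import depthai as dai

pipeline = dai.Pipeline()
# ... create camera / stereo nodes as usual ...

# Enable automatic calibration without adding an AutoCalibration node.
# NOTE: the enum path below is assumed; ON_START and OFF are the modes named in the note above.
pipeline.setAutoCalibrationMode(dai.AutoCalibrationMode.ON_START)

pipeline.start()
```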

## Usage of the Dynamic Calibration Library (DCL)

This section gives a high-level overview of how to use the DynamicCalibration node in DepthAI for dynamic calibration
workflows.

Dynamic calibration takes as input a
[DynamicCalibrationControl](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/dynamic_calibration_control.md)
message with a command (e.g. ApplyCalibration, Calibrate, ...; see all available commands in the message definition). Depending on
which command was sent, the node returns an output on one of four output queues:

 * calibrationOutput with message type
   [DynamicCalibrationResult](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/dynamic_calibration_result.md)
 * qualityOutput with message type
   [CalibrationQuality](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/calibration_quality.md)
 * coverageOutput with message type
   [CoverageData](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/coverage_data.md)
 * metricsOutput with message type
   [CalibrationMetrics](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/calibration_metrics.md)

The following sections show how to implement DCL in your code.

#### Initializing the DynamicCalibration Node

The DynamicCalibration node requires two synchronized camera streams from the same device. Here's how to set it up:

```python
import depthai as dai

# initialize the pipeline
pipeline = dai.Pipeline()

# Create camera nodes
cam_left = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
cam_right = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

# Request full resolution NV12 outputs
left_out = cam_left.requestFullResolutionOutput()
right_out = cam_right.requestFullResolutionOutput()

# Initialize the DynamicCalibration node
dyn_calib = pipeline.create(dai.node.DynamicCalibration)

# Link the cameras to the DynamicCalibration
left_out.link(dyn_calib.left)
right_out.link(dyn_calib.right)

device = pipeline.getDefaultDevice()
calibration = device.readCalibration()
device.setCalibration(calibration)

pipeline.start()
while pipeline.isRunning():
    ...
```

#### Sending Commands to the Node

Nodes in DepthAI communicate via input/output message queues. The DynamicCalibration node has several queues, but the most
important for control is the inputControl queue.

```python
# Initialize the command input queue
command_input = dyn_calib.inputControl.createInputQueue()
# Example: send a command to start the calibration process
command_input.send(dai.DynamicCalibrationControl.startCalibration())
```

Available Commands (a short sketch combining several of them follows this list):

 * StartCalibration() - Starts the continuous calibration process.
 * StopCalibration() - Stops the calibration process.
 * Calibrate(force=False) - Computes a new calibration based on the loaded data.
   * force - ignores restrictions on the loaded data
 * CalibrationQuality(force=False) - Evaluates the quality of the current calibration.
   * force - ignores restrictions on the loaded data
 * LoadImage() - Loads one image from the device.
 * ComputeCalibrationMetrics(calibration) - Computes calibration metrics such as dataQuality and calibrationConfidence.
 * ApplyCalibration(calibration) - Applies a calibration to the device.
 * SetPerformanceMode(performanceMode) - Sets the performance mode to be used.
 * ResetData() - Removes all previously loaded data.
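
The individual sections below show each of these commands in context. As an illustration only, here is a minimal sketch of a
typical sequence sent through the `command_input` queue created above: clear previously collected data, load a few frames, then
request a calibration (in practice you would wait for coverage feedback between loads, as shown in the following sections):

```python
# Clear any previously accumulated calibration data
command_input.send(dai.DynamicCalibrationControl.resetData())

# Load a few frames into the calibration data set
for _ in range(5):
    command_input.send(dai.DynamicCalibrationControl.loadImage())

# Request a new calibration computed from the loaded data
command_input.send(dai.DynamicCalibrationControl.calibrate(force=False))
```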

#### Receiving Data from the Node

The node provides multiple output queues:

 * coverageOutput → coverage statistics
   ([CoverageData](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/coverage_data.md))
 * calibrationOutput → calibration results
   ([DynamicCalibrationResult](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/dynamic_calibration_result.md))
 * qualityOutput → calibration quality check
   ([CalibrationQuality](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/calibration_quality.md))
 * metricsOutput → calibration statistics and data quality
   ([CalibrationMetrics](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/calibration_metrics.md))

```python
# queue for receiving new calibration results
calibration_output = dyn_calib.calibrationOutput.createOutputQueue()
# queue for receiving the coverage
coverage_output = dyn_calib.coverageOutput.createOutputQueue()
# queue for checking the calibration quality 
quality_output = dyn_calib.qualityOutput.createOutputQueue()
# queue for checking the calibration metrics
metrics_output = dyn_calib.metricsOutput.createOutputQueue()
```

Check the references of DynamicCalibrationResult, CoverageData, CalibrationQuality, and CalibrationMetrics to see the exact data
structures of the outputs.
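
The metricsOutput queue is not exercised in the examples below. Based on the command list above, it could be used roughly as in
the following sketch; note that the lowercase `computeCalibrationMetrics(...)` spelling and the argument passed (the calibration
to evaluate) are assumptions made for illustration, so check the CalibrationMetrics reference for the definitive API:

```python
# Assumed usage sketch: request metrics for the calibration currently on the device
current_calibration = device.readCalibration()
command_input.send(dai.DynamicCalibrationControl.computeCalibrationMetrics(current_calibration))

# Read and print the resulting CalibrationMetrics message
metrics = metrics_output.get()
print(metrics)
```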

#### Reading Coverage Data

Coverage data is received via coverageOutput when an image is loaded either manually (with the LoadImage command) or during
continuous calibration (after running the StartCalibration command).

Manual Image Load

```python
# Load a single image
command_input.send(dai.DynamicCalibrationControl.loadImage())

# Get coverage after loading
coverage = coverage_output.get()
print(f"Coverage = {coverage.meanCoverage}")
```

Continuous Collection During Calibration

```python
command_input.send(dai.DynamicCalibrationControl.startCalibration())

while pipeline.isRunning():
    # Blocking read
    coverage = coverage_output.get()
    print(f"Coverage = {coverage.meanCoverage}")

    # Non-blocking read
    coverage = coverage_output.tryGet()
    if coverage:
        print(f"Coverage = {coverage.meanCoverage}")
```
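
A common pattern is to keep loading frames until the node reports that enough data has been acquired, and only then request a
calibration. A minimal sketch of that pattern follows; the `>= 100` threshold assumes `dataAcquired` reports progress toward a
full data set, as the percentage prints above suggest:

```python
# Load frames until the node reports a full data set, then calibrate
while pipeline.isRunning():
    command_input.send(dai.DynamicCalibrationControl.loadImage())
    coverage = coverage_output.get()
    print(f"Data acquired = {coverage.dataAcquired} / 100 [%]")
    if coverage.dataAcquired >= 100:
        command_input.send(dai.DynamicCalibrationControl.calibrate())
        break
```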

#### Reading Calibration Data

Calibration results can be obtained from:

 * dai.DynamicCalibrationControl.startCalibration() — starts collecting data and attempts calibration.
 * dai.DynamicCalibrationControl.calibrate(force=False) — calibrates with the existing loaded data (an image must be loaded
   beforehand with the LoadImage command, as shown in the example below).

Calibration data will be returned as
[DynamicCalibrationResult](https://docs.luxonis.com/software-v3/depthai/depthai-components/messages/dynamic_calibration_result.md)
message type.

Manual Image Load

```python
# Load a single image
command_input.send(dai.DynamicCalibrationControl.loadImage())

# Send a command to calibrate
command_input.send(dai.DynamicCalibrationControl.calibrate(force=False))

# Get calibration after loading
calibration = calibration_output.get()
print(f"Calibration = {calibration.info}")
```

Continuous Collection

```python
# Starts collecting data and attempts calibration
command_input.send(dai.DynamicCalibrationControl.startCalibration())

while pipeline.isRunning():
    # Blocking read
    calibration = calibration_output.get()
    print(f"Calibration = {calibration.info}")

    # Non-blocking read
    calibration = calibration_output.tryGet()
    if calibration:
        print(f"Calibration = {calibration.info}")
```
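
Once a result arrives, the new calibration it carries can be applied to the device with the ApplyCalibration command. A minimal
sketch, using the fields described in the DynamicCalibrationResult reference:

```python
result = calibration_output.get()
print(f"Calibration = {result.info}")

# If a new calibration was produced, apply it to the device
if result.calibrationData:
    command_input.send(
        dai.DynamicCalibrationControl.applyCalibration(result.calibrationData.newCalibration)
    )
```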

#### Performance Modes

Set the performance mode with:

```python
# Set performance mode
command_input.send(
    dai.DynamicCalibrationControl.setPerformanceMode(
        dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_PERFORMANCE
    )
)
```

The performance mode sets the amount of data needed for the calibration.

```python
dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_PERFORMANCE  # Strictest mode, maximum precision
dai.node.DynamicCalibration.PerformanceMode.DEFAULT               # Less strict, usually sufficient
dai.node.DynamicCalibration.PerformanceMode.OPTIMIZE_SPEED        # Optimizes speed over precision
dai.node.DynamicCalibration.PerformanceMode.STATIC_SCENERY        # Not strict; for a fixed camera and stable scene
dai.node.DynamicCalibration.PerformanceMode.SKIP_CHECKS           # Skips all internal checks
```

## Examples

#### Dynamic Calibration Interactive Visualizer

With the following commands you can clone the repository and run the interactive calibration integration example.

```bash
git clone https://github.com/luxonis/depthai-core.git
cd depthai-core/
python3 -m venv venv
source venv/bin/activate
python3 examples/python/install_requirements.py
python3 examples/python/DynamicCalibration/calibration_integration.py
```

#### Calibration Quality Check

#### Python

Run this example by following the [README on
Github](https://github.com/luxonis/depthai-core/blob/main/examples/python/DynamicCalibration/README.md).

```python
import depthai as dai
import numpy as np
import time
import cv2

# ---------- Pipeline definition ----------
with dai.Pipeline() as pipeline:
    # Create camera nodes
    monoLeft  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
    monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

    # Request full resolution NV12 outputs
    monoLeftOut  = monoLeft.requestFullResolutionOutput()
    monoRightOut = monoRight.requestFullResolutionOutput()

    # Initialize the DynamicCalibration node
    dynCalib = pipeline.create(dai.node.DynamicCalibration)

    # Link the cameras to the DynamicCalibration
    monoLeftOut.link(dynCalib.left)
    monoRightOut.link(dynCalib.right)

    stereo = pipeline.create(dai.node.StereoDepth)
    monoLeftOut.link(stereo.left)
    monoRightOut.link(stereo.right)

    # Queues
    syncedLeftQueue  = stereo.syncedLeft.createOutputQueue()
    syncedRightQueue = stereo.syncedRight.createOutputQueue()
    disparityQueue = stereo.disparity.createOutputQueue()

    # Initialize the command output queues for coverage and calibration quality
    dynCalibCoverageQueue = dynCalib.coverageOutput.createOutputQueue()
    dynCalibQualityQueue = dynCalib.qualityOutput.createOutputQueue()

    # Initialize the command input queue
    dynCalibInputControl = dynCalib.inputControl.createInputQueue()

    device = pipeline.getDefaultDevice()
    device.setCalibration(device.readCalibration())

    # Setup the colormap for visualization
    colorMap = cv2.applyColorMap(np.arange(256, dtype=np.uint8), cv2.COLORMAP_JET)
    colorMap[0] = [0, 0, 0]  # to make zero-disparity pixels black
    maxDisparity = 1

    pipeline.start()
    time.sleep(1) # wait for auto exposure to settle

    while pipeline.isRunning():
        leftSynced  = syncedLeftQueue.get()
        rightSynced = syncedRightQueue.get()
        disparity = disparityQueue.get()

        cv2.imshow("left", leftSynced.getCvFrame())
        cv2.imshow("right", rightSynced.getCvFrame())

        # --- Disparity visualization ---
        npDisparity = disparity.getFrame()
        curMax = float(np.max(npDisparity))
        if curMax > 0:
            maxDisparity = max(maxDisparity, curMax)
        normalized = (npDisparity / (maxDisparity if maxDisparity > 0 else 1.0) * 255.0).astype(np.uint8)
        colorizedDisparity = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)
        colorizedDisparity[normalized == 0] = (0, 0, 0)
        cv2.imshow("disparity", colorizedDisparity)

        # --- Load one frame into calibration & read coverage
        dynCalibInputControl.send(dai.DynamicCalibrationControl.loadImage())
        coverage = dynCalibCoverageQueue.get()
        if coverage is not None:
            print(f"2D Spatial Coverage = {coverage.meanCoverage} / 100 [%]")
            print(f"Data Acquired       = {coverage.dataAcquired} / 100 [%]")

        # --- Request a quality evaluation & read result
        dynCalibInputControl.send(dai.DynamicCalibrationControl.calibrationQuality(False))
        dynQualityResult = dynCalibQualityQueue.get()
        if dynQualityResult is not None:
            print(f"Dynamic calibration status: {dynQualityResult.info}")

            # If the calibration is successfully returned apply it to the device
            if dynQualityResult.qualityData:
                q = dynQualityResult.qualityData
                print("Successfully evaluated Quality")
                rotDiff = float(np.sqrt(q.rotationChange[0]**2 +
                                        q.rotationChange[1]**2 +
                                        q.rotationChange[2]**2))
                print(f"Rotation difference: || r_current - r_new || = {rotDiff:.2f} deg")
                print(f"Mean Sampson error achievable = {q.sampsonErrorNew:.3f} px")
                print(f"Mean Sampson error current    = {q.sampsonErrorCurrent:.3f} px")
                print(
                    "Theoretical Depth Error Difference "
                    f"@1m:{q.depthErrorDifference[0]:.2f}%, "
                    f"2m:{q.depthErrorDifference[1]:.2f}%, "
                    f"5m:{q.depthErrorDifference[2]:.2f}%, "
                    f"10m:{q.depthErrorDifference[3]:.2f}%"
                )
                # Reset temporary accumulators before the next cycle
                dynCalibInputControl.send(dai.DynamicCalibrationControl.resetData())

        key = cv2.waitKey(1)
        if key == ord('q'):
            pipeline.stop()
            break
```

#### C++

Example from our
[Github](https://github.com/luxonis/depthai-core/blob/main/examples/cpp/DynamicCalibration/calibration_quality_dynamic.cpp):

```cpp
#include <chrono>
#include <cmath>
#include <iomanip>
#include <iostream>
#include <opencv2/opencv.hpp>
#include <thread>

#include "depthai/depthai.hpp"

int main() {
    auto device = std::make_shared<dai::Device>();

    // ---------- Pipeline definition ----------
    dai::Pipeline pipeline(device);

    auto monoLeft = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_B);
    auto monoRight = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_C);

    auto* leftOut = monoLeft->requestFullResolutionOutput();
    auto* rightOut = monoRight->requestFullResolutionOutput();

    // Dynamic-calibration node
    auto dynCalib = pipeline.create<dai::node::DynamicCalibration>();
    leftOut->link(dynCalib->left);
    rightOut->link(dynCalib->right);

    auto stereo = pipeline.create<dai::node::StereoDepth>();
    leftOut->link(stereo->left);
    rightOut->link(stereo->right);

    // In-pipeline host queues
    auto leftSyncedQueue = stereo->syncedLeft.createOutputQueue();
    auto rightSyncedQueue = stereo->syncedRight.createOutputQueue();
    auto disparityQueue = stereo->disparity.createOutputQueue();

    auto dynQualityOutQ = dynCalib->qualityOutput.createOutputQueue();
    auto dynCoverageOutQ = dynCalib->coverageOutput.createOutputQueue();
    auto dynCalibInputControl = dynCalib->inputControl.createInputQueue();

    device->setCalibration(device->readCalibration());

    pipeline.start();
    std::this_thread::sleep_for(std::chrono::seconds(1));  // wait for autoexposure to settle

    using DCC = dai::DynamicCalibrationControl;

    while(pipeline.isRunning()) {
        auto leftSynced = leftSyncedQueue->get<dai::ImgFrame>();
        auto rightSynced = rightSyncedQueue->get<dai::ImgFrame>();
        auto disparity = disparityQueue->get<dai::ImgFrame>();

        cv::imshow("left", leftSynced->getCvFrame());
        cv::imshow("right", rightSynced->getCvFrame());

        // --- Load one frame pair into the calibration pipeline
        dynCalibInputControl->send(DCC::loadImage());

        // Wait for coverage info
        auto coverageMsg = dynCoverageOutQ->get<dai::CoverageData>();
        if(coverageMsg) {
            std::cout << "2D Spatial Coverage = " << coverageMsg->meanCoverage << " / 100 [%]" << std::endl;
            std::cout << "Data Acquired       = " << coverageMsg->dataAcquired << " / 100 [%]" << std::endl;
        }

        // Request a calibration quality evaluation (non-forced)
        dynCalibInputControl->send(DCC::calibrationQuality(false));

        // Wait for calibration result
        auto dynCalibrationResult = dynQualityOutQ->get<dai::CalibrationQuality>();
        if(dynCalibrationResult) {
            std::cout << "Dynamic calibration status: " << dynCalibrationResult->info << std::endl;

            if(dynCalibrationResult->qualityData) {
                std::cout << "Successfully evaluated Quality." << std::endl;

                const auto& q = *dynCalibrationResult->qualityData;

                // --- Rotation difference magnitude (degrees) ---
                float rotDiff = std::sqrt(q.rotationChange[0] * q.rotationChange[0] + q.rotationChange[1] * q.rotationChange[1]
                                          + q.rotationChange[2] * q.rotationChange[2]);
                std::cout << "Rotation difference: || r_current - r_new || = " << rotDiff << " deg" << std::endl;

                // --- Sampson error (px) ---
                std::cout << "Mean Sampson error achievable = " << q.sampsonErrorNew << " px" << std::endl;
                std::cout << "Mean Sampson error current    = " << q.sampsonErrorCurrent << " px" << std::endl;

                // --- Depth error difference (%) at 1/2/5/10 m ---
                std::cout << "Theoretical Depth Error Difference " << "@1m:" << std::fixed << std::setprecision(2) << q.depthErrorDifference[0] << "%, "
                          << "2m:" << q.depthErrorDifference[1] << "%, " << "5m:" << q.depthErrorDifference[2] << "%, " << "10m:" << q.depthErrorDifference[3]
                          << "%" << std::endl;

                // (Optional) Trigger a calibration step if desired:
                // dynCalibInputControl->send(DCC::calibrate(true));

                // Reset temporary data after reading metrics
                dynCalibInputControl->send(DCC::resetData());
            }
        } else {
            std::cout << "Dynamic calibration: no result received." << std::endl;
        }

        int key = cv::waitKey(1);
        if(key == 'q') break;
    }

    return 0;
}
```

#### Dynamic Calibration

#### Python

Run this example by following the [README on
Github](https://github.com/luxonis/depthai-core/blob/main/examples/python/DynamicCalibration/README.md).

```python
import depthai as dai
import numpy as np
import time
import cv2

# ---------- Pipeline definition ----------
with dai.Pipeline() as pipeline:
    # Cameras
    monoLeft  = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
    monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

    # Full-res NV12 outputs
    monoLeftOut  = monoLeft.requestFullResolutionOutput()
    monoRightOut = monoRight.requestFullResolutionOutput()

    # Initialize the DynamicCalibration node
    dynCalib = pipeline.create(dai.node.DynamicCalibration)

    # Link the cameras to the DynamicCalibration
    monoLeftOut.link(dynCalib.left)
    monoRightOut.link(dynCalib.right)

    # Stereo (for disparity + synced previews)
    stereo = pipeline.create(dai.node.StereoDepth)
    monoLeftOut.link(stereo.left)
    monoRightOut.link(stereo.right)

    # Output queues
    syncedLeftQueue  = stereo.syncedLeft.createOutputQueue()
    syncedRightQueue = stereo.syncedRight.createOutputQueue()
    disparityQueue   = stereo.disparity.createOutputQueue()

    # Initialize the command output queues for calibration and coverage
    dynCalibCalibrationQueue = dynCalib.calibrationOutput.createOutputQueue()
    dynCalibCoverageQueue    = dynCalib.coverageOutput.createOutputQueue()

    # Initialize the command input queue
    dynCalibInputControl = dynCalib.inputControl.createInputQueue()

    device = pipeline.getDefaultDevice()
    device.setCalibration(device.readCalibration())

    # Setup the colormap for visualization
    colorMap = cv2.applyColorMap(np.arange(256, dtype=np.uint8), cv2.COLORMAP_JET)
    colorMap[0] = [0, 0, 0]  # to make zero-disparity pixels black
    maxDisparity = 1.0

    pipeline.start()
    time.sleep(1) # wait for auto exposure to settle

    # Set performance mode
    dynCalibInputControl.send(
        dai.DynamicCalibrationControl.setPerformanceMode(
            dai.DynamicCalibrationControl.PerformanceMode.OPTIMIZE_PERFORMANCE
        )
    )

    # Start periodic calibration
    dynCalibInputControl.send(
        dai.DynamicCalibrationControl.startCalibration()
    )

    while pipeline.isRunning():
        leftSynced  = syncedLeftQueue.get()
        rightSynced = syncedRightQueue.get()
        disparity = disparityQueue.get()

        cv2.imshow("left", leftSynced.getCvFrame())
        cv2.imshow("right", rightSynced.getCvFrame())

        # --- Disparity visualization ---
        npDisparity = disparity.getFrame()
        curMax = float(np.max(npDisparity))
        if curMax > 0:
            maxDisparity = max(maxDisparity, curMax)

        # Normalize to [0,255] and colorize; keep zero-disparity as black
        denom = maxDisparity if maxDisparity > 0 else 1.0
        normalized = (npDisparity / denom * 255.0).astype(np.uint8)
        colorizedDisparity = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)
        colorizedDisparity[normalized == 0] = (0, 0, 0)
        cv2.imshow("disparity", colorizedDisparity)

        # --- Coverage (non-blocking) ---
        coverage = dynCalibCoverageQueue.tryGet()
        if coverage is not None:
            print(f"2D Spatial Coverage = {coverage.meanCoverage} / 100 [%]")
            print(f"Data Acquired       = {coverage.dataAcquired} / 100 [%]")

        # --- Calibration result (non-blocking) ---
        dynCalibrationResult = dynCalibCalibrationQueue.tryGet()
        calibrationData = dynCalibrationResult.calibrationData if dynCalibrationResult is not None else None

        if dynCalibrationResult is not None:
            print(f"Dynamic calibration status: {dynCalibrationResult.info}")

        # --- Apply calibration if available, print quality deltas, then reset+continue ---
        if calibrationData:
            print("Successfully calibrated")
            # Apply to device
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl.applyCalibration(calibrationData.newCalibration)
            )

            q = calibrationData.calibrationDifference
            rotDiff = float(np.sqrt(q.rotationChange[0]**2 +
                                    q.rotationChange[1]**2 +
                                    q.rotationChange[2]**2))
            print(f"Rotation difference: || r_current - r_new || = {rotDiff:.2f} deg")
            print(f"Mean Sampson error achievable = {q.sampsonErrorNew:.3f} px")
            print(f"Mean Sampson error current    = {q.sampsonErrorCurrent:.3f} px")
            print("Theoretical Depth Error Difference "
                  f"@1m:{q.depthErrorDifference[0]:.2f}%, "
                  f"2m:{q.depthErrorDifference[1]:.2f}%, "
                  f"5m:{q.depthErrorDifference[2]:.2f}%, "
                  f"10m:{q.depthErrorDifference[3]:.2f}%")

            # Reset accumulators and continue periodic calibration
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl.resetData()
            )
            dynCalibInputControl.send(
                dai.DynamicCalibrationControl.startCalibration()
            )

        key = cv2.waitKey(1)
        if key == ord('q'):
            pipeline.stop()
            break
```

#### C++

Example from our
[Github](https://github.com/luxonis/depthai-core/blob/main/examples/cpp/DynamicCalibration/calibration_dynamic.cpp):

```cpp
// examples/cpp/DynamicCalibration/calibration_dynamic.cpp
#include <algorithm>
#include <chrono>
#include <cmath>
#include <iomanip>
#include <iostream>
#include <opencv2/opencv.hpp>
#include <thread>

#include "depthai/depthai.hpp"

int main() {
    auto device = std::make_shared<dai::Device>();

    // ---------- Pipeline definition ----------
    dai::Pipeline pipeline(device);

    auto monoLeft = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_B);
    auto monoRight = pipeline.create<dai::node::Camera>()->build(dai::CameraBoardSocket::CAM_C);

    auto* leftOut = monoLeft->requestFullResolutionOutput();
    auto* rightOut = monoRight->requestFullResolutionOutput();

    // Dynamic-calibration node
    auto dynCalib = pipeline.create<dai::node::DynamicCalibration>();
    leftOut->link(dynCalib->left);
    rightOut->link(dynCalib->right);

    auto stereo = pipeline.create<dai::node::StereoDepth>();
    leftOut->link(stereo->left);
    rightOut->link(stereo->right);

    // In-pipeline host queues
    auto leftSyncedQueue = stereo->syncedLeft.createOutputQueue();
    auto rightSyncedQueue = stereo->syncedRight.createOutputQueue();
    auto disparityQueue = stereo->disparity.createOutputQueue();

    auto dynCalibOutQ = dynCalib->calibrationOutput.createOutputQueue();
    auto dynCoverageOutQ = dynCalib->coverageOutput.createOutputQueue();

    auto dynCalibInputControl = dynCalib->inputControl.createInputQueue();

    device->setCalibration(device->readCalibration());

    pipeline.start();
    std::this_thread::sleep_for(std::chrono::seconds(1));  // wait for autoexposure to settle

    using DCC = dai::DynamicCalibrationControl;
    // Optionally set performance mode:
    dynCalibInputControl->send(DCC::setPerformanceMode(DCC::PerformanceMode::OPTIMIZE_PERFORMANCE));

    // Start calibration (optimize performance)
    dynCalibInputControl->send(DCC::startCalibration());

    double maxDisparity = 1.0;
    while(pipeline.isRunning()) {
        auto leftSynced = leftSyncedQueue->get<dai::ImgFrame>();
        auto rightSynced = rightSyncedQueue->get<dai::ImgFrame>();
        auto disparity = disparityQueue->get<dai::ImgFrame>();

        cv::imshow("left", leftSynced->getCvFrame());
        cv::imshow("right", rightSynced->getCvFrame());

        cv::Mat npDisparity = disparity->getFrame();

        double minVal = 0.0, curMax = 0.0;
        cv::minMaxLoc(npDisparity, &minVal, &curMax);
        maxDisparity = std::max(maxDisparity, curMax);

        // Normalize the disparity image to an 8-bit scale.
        cv::Mat normalized;
        npDisparity.convertTo(normalized, CV_8UC1, 255.0 / (maxDisparity > 0 ? maxDisparity : 1.0));

        cv::Mat colorizedDisparity;
        cv::applyColorMap(normalized, colorizedDisparity, cv::COLORMAP_JET);

        // Set pixels with zero disparity to black.
        colorizedDisparity.setTo(cv::Scalar(0, 0, 0), normalized == 0);

        cv::imshow("disparity", colorizedDisparity);

        // Coverage (non-blocking)
        if(auto coverageMsg = dynCoverageOutQ->tryGet<dai::CoverageData>()) {
            std::cout << "2D Spatial Coverage = " << coverageMsg->meanCoverage << "  / 100 [%]\n";
            std::cout << "Data Acquired       = " << coverageMsg->dataAcquired << "  / 100 [%]\n";
        }

        // Calibration result (non-blocking)
        if(auto dynCalibrationResult = dynCalibOutQ->tryGet<dai::DynamicCalibrationResult>()) {
            std::cout << "Dynamic calibration status: " << dynCalibrationResult->info << std::endl;

            if(dynCalibrationResult->calibrationData) {
                std::cout << "Successfully calibrated." << std::endl;

                // Apply the produced calibration
                const auto& newCalib = dynCalibrationResult->calibrationData->newCalibration;
                dynCalibInputControl->send(DCC::applyCalibration(newCalib));

                // Print quality deltas
                const auto& q = dynCalibrationResult->calibrationData->calibrationDifference;

                float rotDiff = std::sqrt(q.rotationChange[0] * q.rotationChange[0] + q.rotationChange[1] * q.rotationChange[1]
                                          + q.rotationChange[2] * q.rotationChange[2]);
                std::cout << "Rotation difference: " << rotDiff << " deg\n";
                std::cout << "Mean Sampson error achievable = " << q.sampsonErrorNew << " px\n";
                std::cout << "Mean Sampson error current    = " << q.sampsonErrorCurrent << " px\n";
                std::cout << "Theoretical Depth Error Difference " << "@1m:" << std::fixed << std::setprecision(2) << q.depthErrorDifference[0] << "%, "
                          << "2m:" << q.depthErrorDifference[1] << "%, " << "5m:" << q.depthErrorDifference[2] << "%, " << "10m:" << q.depthErrorDifference[3]
                          << "%\n";

                // Reset and start a new round if desired
                dynCalibInputControl->send(DCC::startCalibration());
            }
        }

        int key = cv::waitKey(1);
        if(key == 'q') break;
    }

    return 0;
}
```

> Dynamic Calibration will not fully restore the factory-specified absolute depth accuracy. Use factory tools for
> production-grade re-calibration.

For more information please follow the example's README.

## Scenery guidelines

Good calibration scenes make it easier for algorithms to detect, match, and track features. Recommended characteristics:

 * Include textured objects at multiple depths.
 * Avoid blank walls or featureless surfaces.
 * Slowly move the camera to cover the full FOV; avoid sudden motions.

 * ✅ **Ensure rich texture and visual detail** - rich textures, edges, and objects evenly distributed across the FOV create ideal
   calibration conditions.
 * 🚫 **Avoid flat or featureless surfaces** - a lack of textured surfaces or visually distinct objects provides few usable
   features.
 * 🚫 **Avoid reflective and transparent surfaces** - reflective and transparent surfaces create false 3D features.
 * 🚫 **Avoid dark scenes** - low contrast, shadows, and poorly lit scenes yield few detectable features.
 * 🚫 **Avoid repetitive patterns** - many patterned regions look too similar to tell apart.

## PerformanceMode tuning

| Mode | When to use |
| --- | --- |
| `DEFAULT` | Balanced accuracy vs. speed. |
| `STATIC_SCENERY` | Camera is fixed, scene stable. |
| `OPTIMIZE_SPEED` | Fastest calibration, reduced precision. |
| `OPTIMIZE_PERFORMANCE` | Maximum precision in feature-rich scenes. |
| `SKIP_CHECKS` | Automated pipelines where the internal checks that guarantee scene quality are skipped. |

> For highest accuracy, combine **OPTIMIZE_PERFORMANCE** with a dynamic, well-featured environment.

By combining appropriate scenery with the correct PerformanceMode, users can significantly improve calibration reliability and
depth estimation quality.
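
For example, for a fixed installation observing a stable scene you would typically pick `STATIC_SCENERY`. A minimal sketch,
reusing the `command_input` queue from the snippets above:

```python
# Relax data requirements for a fixed camera watching a stable scene
command_input.send(
    dai.DynamicCalibrationControl.setPerformanceMode(
        dai.node.DynamicCalibration.PerformanceMode.STATIC_SCENERY
    )
)
```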

## Limitations & notes

 * Supported devices — Dynamic Calibration is available for:
   * All stereo OAK Series 2 cameras (excluding FFC)
   * All stereo OAK Series 4 cameras
 * DepthAI version — requires DepthAI 3.0 or later.
 * Re-calibrated parameters — updates extrinsics only; intrinsics remain unchanged.
 * OS support — Available on Linux, macOS and Windows.
 * Absolute depth spec — DCL improves relative depth perception; absolute accuracy may still differ slightly from the original
   factory spec.

## Troubleshooting

| Symptom | Possible cause | Fix |
| --- | --- | --- |
| *High reprojection error* | Incorrect model name or HFOV in board config | Verify the board JSON and camera specs |
| Depth still incorrect after a "successful" DCL run | Left / right cameras swapped | Swap sockets or update the board config and recalibrate |
| Empty (`nullopt` / `None`) quality report | Insufficient scene coverage | Move the camera to capture richer textures (see the sketch below) |
| Runtime error: `"The calibration on the device is too old to perform DynamicCalibration, full re-calibration required!"` | The device calibration is too outdated for dynamic recalibration to provide any benefit. | A full re-calibration of the device is needed |
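
If you hit the empty quality report case, you can detect it in code and keep collecting data. A minimal sketch using the
`command_input` and `quality_output` queues from the walkthrough above:

```python
# Request a quality evaluation and check whether enough data was available
command_input.send(dai.DynamicCalibrationControl.calibrationQuality(False))
quality = quality_output.get()
if not quality.qualityData:
    print("Not enough scene coverage yet - keep moving the camera over textured areas")
```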

## See also

 * [General Dynamic Calibration information page](https://docs.luxonis.com/hardware/platform/depth/dynamic-calibration.md)
 * [Manual stereo and ToF calibration guide](https://docs.luxonis.com/hardware/platform/depth/manual-calibration.md)
 * If you'd like automatic calibration, have a look at the [AutoCalibration documentation
   page](https://docs.luxonis.com/software-v3/depthai/depthai-components/host_nodes/auto_calibration.md).
