NeuralDepth
Supported on: RVC4
NeuralDepth is only supported on RVC4 devices running Luxonis OS 1.20.4 or newer.
How to place it
Python
pipeline = dai.Pipeline()
neuralDepth = pipeline.create(dai.node.NeuralDepth)

Inputs and Outputs
Model Sizes
| Model | Resolution | FPS |
|---|---|---|
| NEURAL_DEPTH_LARGE | 768x480 | 10.8 |
| NEURAL_DEPTH_MEDIUM | 576x360 | 25.5 |
| NEURAL_DEPTH_SMALL (default) | 480x300 | 42.5 |
| NEURAL_DEPTH_NANO | 384x240 | 59.7 |
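Selecting a size is a matter of passing the corresponding DeviceModelZoo entry to build(). A sketch, assuming the same camera wiring as the Usage example below (running it requires a connected RVC4 device):

```python
import depthai as dai

pipeline = dai.Pipeline()

# Mono cameras feeding the stereo pair
monoLeft = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

# Trade throughput for accuracy: MEDIUM runs at ~25 FPS vs ~42 FPS for the default SMALL
neuralDepth = pipeline.create(dai.node.NeuralDepth)
neuralDepth.build(
    monoLeft.requestOutput((640, 400)),
    monoRight.requestOutput((640, 400)),
    dai.DeviceModelZoo.NEURAL_DEPTH_MEDIUM,
)
```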
Post-Processing
- Confidence Threshold: Pixels with confidence below this threshold are considered invalid. Valid range is [0, 255], default is 125.
- Edge Threshold: Pixels with edge magnitude below this value are considered invalid. Valid range is [0, 255], default is 10.
- Temporal Filter: Averages depth values over time to reduce noise. The filter has the following parameters:
  - enable: Boolean to enable/disable the filter (default: false)
  - alpha: Exponential moving average smoothing factor in range [0, 1]. Lower values provide more smoothing (great for static scenes); a value of 1.0 effectively acts as a passthrough (default: 0.4)
  - delta: Threshold for disparity change detection in range [0, 255]. Only pixels whose disparity change (current frame vs. previous frame) is below this value are filtered; otherwise filtering for that pixel is disabled (default: 100)
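Beyond initialConfig, these parameters can also be adjusted at runtime through the node's inputConfig input (listed in the Reference below). The following is only a sketch of that pattern: the dai.NeuralDepthConfig message type and its setters are assumed to mirror initialConfig, and a connected RVC4 device is required:

```python
import depthai as dai

pipeline = dai.Pipeline()
monoLeft = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

neuralDepth = pipeline.create(dai.node.NeuralDepth)
neuralDepth.build(monoLeft.requestOutput((640, 400)),
                  monoRight.requestOutput((640, 400)),
                  dai.DeviceModelZoo.NEURAL_DEPTH_SMALL)

# Queue for pushing config messages into the running pipeline
configQueue = neuralDepth.inputConfig.createInputQueue()
pipeline.start()

# Tighten the validity criterion while the pipeline is running
cfg = dai.NeuralDepthConfig()  # assumed message type, mirroring the C++ NeuralDepthConfig
cfg.setConfidenceThreshold(200)
configQueue.send(cfg)
```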
Rectification
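The node rectifies its inputs by default; for pre-rectified frames this can be turned off. A sketch, assuming the same wiring as the Usage example below and a connected RVC4 device:

```python
import depthai as dai

pipeline = dai.Pipeline()
monoLeft = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

neuralDepth = pipeline.create(dai.node.NeuralDepth)
neuralDepth.build(monoLeft.requestOutput((640, 400)),
                  monoRight.requestOutput((640, 400)),
                  dai.DeviceModelZoo.NEURAL_DEPTH_SMALL)

# Inputs are assumed to be rectified already, so skip the internal rectification stage
neuralDepth.setRectification(False)
```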
If the input frames are already rectified, rectification can be disabled with setRectification(false).

Usage
Python
import depthai as dai

pipeline = dai.Pipeline()

# Create mono cameras
monoLeft = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_B)
monoRight = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_C)

# Create outputs
leftOutput = monoLeft.requestOutput((640, 400))
rightOutput = monoRight.requestOutput((640, 400))

# Create NeuralDepth node
neuralDepth = pipeline.create(dai.node.NeuralDepth)
neuralDepth.build(leftOutput, rightOutput, dai.DeviceModelZoo.NEURAL_DEPTH_SMALL)

# Configure post-processing
neuralDepth.initialConfig.setConfidenceThreshold(125)
neuralDepth.initialConfig.setEdgeThreshold(10)

neuralDepth.initialConfig.postProcessing.temporalFilter.enable = True
neuralDepth.initialConfig.postProcessing.temporalFilter.delta = 100
neuralDepth.initialConfig.postProcessing.temporalFilter.alpha = 0.4

# Create output queue
depthQueue = neuralDepth.depth.createOutputQueue()

Examples of functionality
- Neural Depth Minimal - Minimal example showing basic NeuralDepth usage with disparity output visualization.
- Neural Depth - Demonstrates the NeuralDepth node with runtime configuration of confidence threshold, edge threshold, and temporal filtering.
- Neural Depth RGBD - Combines NeuralDepth with the RGBD node to generate a point cloud, viewable via remote connection.
- Neural Depth Align - Demonstrates aligning NeuralDepth output to an RGB camera using the ImageAlign node.
Reference
class
dai::node::NeuralDepth
variable
std::shared_ptr< NeuralDepthConfig > initialConfig
variable
Subnode< MessageDemux > messageDemux
variable
Subnode< Rectification > rectification
variable
Subnode< NeuralNetwork > neuralNetwork
variable
Input inputConfig
Input config to modify parameters in runtime.
variable
Input leftInternal
Input left frame internal, used to extract frame info
variable
Input rightInternal
Input right frame internal, used to extract frame info
function
NeuralDepth & setRectification(bool enable)
Enable or disable rectification (useful for pre-rectified inputs)
function
std::shared_ptr< NeuralDepth > build(Output & left, Output & right, DeviceModelZoo model)
function
void buildInternal()

Need assistance?
Head over to Discussion Forum for technical support or any other questions you might have.