MobileNetSpatialDetectionNetwork

Spatial detection network for MobileNet-SSD based models. It behaves like a combination of the MobileNetDetectionNetwork and the SpatialLocationCalculator: it parses the detection results on the device and augments each detection with 3D spatial coordinates computed from the depth frame.

How to place it

Python

pipeline = dai.Pipeline()
mobilenetSpatial = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)

C++

dai::Pipeline pipeline;
auto mobilenetSpatial = pipeline.create<dai::node::MobileNetSpatialDetectionNetwork>();

Inputs and Outputs

               ┌───────────────────┐
input          │                   │       passthrough
──────────────►│-------------------├─────────────────►
               │     MobileNet     │               out
               │     Spatial       ├─────────────────►
               │     Detection     │boundingBoxMapping
               │     Network       ├─────────────────►
inputDepth     │                   │  passthroughDepth
──────────────►│-------------------├─────────────────►
               └───────────────────┘

Message types

- input - ImgFrame
- inputDepth - ImgFrame
- out - SpatialImgDetections
- boundingBoxMapping - SpatialLocationCalculatorConfig
- passthrough - ImgFrame
- passthroughDepth - ImgFrame

Usage

Python

pipeline = dai.Pipeline()
mobilenetSpatial = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)

mobilenetSpatial.setBlobPath(nnBlobPath)
# Will ignore all detections whose confidence is below 50%
mobilenetSpatial.setConfidenceThreshold(0.5)
mobilenetSpatial.input.setBlocking(False)
# Scale factor for the depth ROI derived from each bounding box (a smaller ROI can give a more stable reading)
mobilenetSpatial.setBoundingBoxScaleFactor(0.5)
# Min/max depth thresholds in depth units (mm by default); values outside this range are treated as invalid (0)
mobilenetSpatial.setDepthLowerThreshold(100)
mobilenetSpatial.setDepthUpperThreshold(5000)

# Link depth from the StereoDepth node
stereo.depth.link(mobilenetSpatial.inputDepth)

C++

dai::Pipeline pipeline;
auto mobilenetSpatial = pipeline.create<dai::node::MobileNetSpatialDetectionNetwork>();

mobilenetSpatial->setBlobPath(nnBlobPath);
// Will ignore all detections whose confidence is below 50%
mobilenetSpatial->setConfidenceThreshold(0.5f);
mobilenetSpatial->input.setBlocking(false);
// Scale factor for the depth ROI derived from each bounding box (a smaller ROI can give a more stable reading)
mobilenetSpatial->setBoundingBoxScaleFactor(0.5f);
// Min/max depth thresholds in depth units (mm by default); values outside this range are treated as invalid (0)
mobilenetSpatial->setDepthLowerThreshold(100);
mobilenetSpatial->setDepthUpperThreshold(5000);

// Link depth from the StereoDepth node
stereo->depth.link(mobilenetSpatial->inputDepth);
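
To consume the results on the host, the node's out output is typically linked to an XLinkOut node and read through an output queue. Below is a minimal Python sketch that continues the Python snippet above; the stream name "detections" and the surrounding device setup are illustrative assumptions. Each SpatialImgDetection carries the label, confidence, bounding box and spatialCoordinates (X, Y, Z in depth units, millimeters by default).

xoutNN = pipeline.create(dai.node.XLinkOut)
xoutNN.setStreamName("detections")  # hypothetical stream name
mobilenetSpatial.out.link(xoutNN.input)

with dai.Device(pipeline) as device:
    detQueue = device.getOutputQueue(name="detections", maxSize=4, blocking=False)
    while True:
        inDet = detQueue.get()  # SpatialImgDetections message
        for det in inDet.detections:
            # Spatial coordinates are reported in depth units (millimeters by default)
            print(det.label, det.confidence,
                  det.spatialCoordinates.x, det.spatialCoordinates.y, det.spatialCoordinates.z)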

Examples of functionality

Spatial coordinate system

OAK cameras use a left-handed (Cartesian) coordinate system for all spatial coordinates.

https://github.com/luxonis/depthai-python/assets/18037362/f9bfaa0c-0286-46c0-910c-77c1337493e1

The middle of the frame is (0, 0) in terms of X and Y coordinates. Y increases as you move up, X increases as you move right, and Z is the distance from the camera.
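
For intuition, a short illustrative snippet; the values are made up and only demonstrate the sign convention described above:

# Hypothetical spatial coordinates of one detection, in millimeters
x, y, z = -320, 150, 2100
# x = -320 -> ~32 cm to the left of the frame center
# y =  150 -> ~15 cm above the frame center
# z = 2100 -> ~2.1 m away from the camera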

Reference

class depthai.node.MobileNetSpatialDetectionNetwork

MobileNetSpatialDetectionNetwork node. Mobilenet-SSD based network with spatial location data.

class Connection

Connection between an Input and Output

class Id

Node identifier. Unique for every node on a single Pipeline

Properties

alias of depthai.SpatialDetectionNetworkProperties

property boundingBoxMapping

Outputs mapping of detected bounding boxes relative to depth map

Useful when displaying remapped bounding boxes on the depth frame
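
For example, this output can be streamed to the host and used to draw the depth ROIs. A minimal Python sketch under stated assumptions: the stream name, the bbMappingQueue output queue, and the depthFrame array are hypothetical and not part of this reference.

xoutBBMapping = pipeline.create(dai.node.XLinkOut)
xoutBBMapping.setStreamName("boundingBoxDepthMapping")  # hypothetical stream name
mobilenetSpatial.boundingBoxMapping.link(xoutBBMapping.input)

# On the host: the message is a SpatialLocationCalculatorConfig
bbMapping = bbMappingQueue.get()
for roiData in bbMapping.getConfigData():
    # Denormalize the ROI to the depth frame resolution before drawing it
    roi = roiData.roi.denormalize(depthFrame.shape[1], depthFrame.shape[0])
    topLeft = roi.topLeft()
    bottomRight = roi.bottomRight()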

getAssetManager(*args, **kwargs)

Overloaded function.

  1. getAssetManager(self: depthai.Node) -> depthai.AssetManager

Get node AssetManager as a const reference

  2. getAssetManager(self: depthai.Node) -> depthai.AssetManager

Get node AssetManager as a const reference

getConfidenceThreshold(self: depthai.node.DetectionNetwork) → float

Retrieves threshold at which to filter the rest of the detections.

Returns

Detection confidence

getInputRefs(*args, **kwargs)

Overloaded function.

  1. getInputRefs(self: depthai.Node) -> List[depthai.Node.Input]

Retrieves reference to node inputs

  2. getInputRefs(self: depthai.Node) -> List[depthai.Node.Input]

Retrieves reference to node inputs

getInputs(self: depthai.Node) → List[depthai.Node.Input]

Retrieves all of the node's inputs

getName(self: depthai.Node) → str

Retrieves the node's name

getNumInferenceThreads(self: depthai.node.NeuralNetwork) → int

How many inference threads will be used to run the network

Returns

Number of threads, 0, 1 or 2. Zero means AUTO

getOutputRefs(*args, **kwargs)

Overloaded function.

  1. getOutputRefs(self: depthai.Node) -> List[depthai.Node.Output]

Retrieves reference to node outputs

  2. getOutputRefs(self: depthai.Node) -> List[depthai.Node.Output]

Retrieves reference to node outputs

getOutputs(self: depthai.Node) → List[depthai.Node.Output]

Retrieves all of the node's outputs

getParentPipeline(*args, **kwargs)

Overloaded function.

  1. getParentPipeline(self: depthai.Node) -> depthai.Pipeline

  2. getParentPipeline(self: depthai.Node) -> depthai.Pipeline

property id

Id of node

property input

Input message with data to be inferred upon. Default queue is blocking with size 5

property inputDepth

Input message with depth data used to retrieve spatial information about detected objects. Default queue is non-blocking with size 4

property inputs

Inputs mapped to network inputs. Useful for inferring from separate data sources. Default input is non-blocking with queue size 1 and waits for messages

property out

Outputs SpatialImgDetections message that carries parsed detection results together with spatial coordinates.

property outNetwork

Outputs unparsed inference results.

property passthrough

Passthrough message on which the inference was performed.

Useful when the input queue is set to non-blocking behavior.

property passthroughDepth

Passthrough message for depth frame on which the spatial location calculation was performed.

Useful when the input queue is set to non-blocking behavior.

property passthroughs

Passthroughs that correspond to the specified inputs

setBlob(*args, **kwargs)

Overloaded function.

  1. setBlob(self: depthai.node.NeuralNetwork, blob: depthai.OpenVINO.Blob) -> None

Load network blob into assets and use once pipeline is started.

Parameter blob:

Network blob

  2. setBlob(self: depthai.node.NeuralNetwork, path: Path) -> None

Same functionality as the setBlobPath(). Load network blob into assets and use once pipeline is started.

Throws:

Error if file doesn’t exist or isn’t a valid network blob.

Parameter path:

Path to network blob

setBlobPath(self: depthai.node.NeuralNetwork, path: Path) → None

Load network blob into assets and use once pipeline is started.

Throws:

Error if file doesn’t exist or isn’t a valid network blob.

Parameter path:

Path to network blob

setBoundingBoxScaleFactor(self: depthai.node.SpatialDetectionNetwork, scaleFactor: float) → None

Specifies scale factor for detected bounding boxes.

Parameter scaleFactor:

Scale factor must be in the interval (0,1].

setConfidenceThreshold(self: depthai.node.DetectionNetwork, thresh: float) → None

Specifies confidence threshold at which to filter the rest of the detections.

Parameter thresh:

Detection confidence must be greater than specified threshold to be added to the list

setDepthLowerThreshold(self: depthai.node.SpatialDetectionNetwork, lowerThreshold: int) → None

Specifies lower threshold in depth units (millimeters by default) for depth values which will be used to calculate spatial data

Parameter lowerThreshold:

lowerThreshold must be in the interval [0, upperThreshold), i.e. strictly less than upperThreshold.

setDepthUpperThreshold(self: depthai.node.SpatialDetectionNetwork, upperThreshold: int) → None

Specifies upper threshold in depth units (millimeters by default) for depth values which will be used to calculate spatial data

Parameter upperThreshold:

upperThreshold must be in the interval (lowerThreshold, 65535].

setNumInferenceThreads(self: depthai.node.NeuralNetwork, numThreads: int) → None

How many threads should the node use to run the network.

Parameter numThreads:

Number of threads to dedicate to this node

setNumNCEPerInferenceThread(self: depthai.node.NeuralNetwork, numNCEPerThread: int) → None

How many Neural Compute Engines should a single thread use for inference

Parameter numNCEPerThread:

Number of NCE per thread

setNumPoolFrames(self: depthai.node.NeuralNetwork, numFrames: int) → None

Specifies how many frames will be available in the pool

Parameter numFrames:

How many frames will pool have

setSpatialCalculationAlgorithm(self: depthai.node.SpatialDetectionNetwork, calculationAlgorithm: depthai.SpatialLocationCalculatorAlgorithm) → None

Specifies spatial location calculator algorithm: Average/Min/Max

Parameter calculationAlgorithm:

Calculation algorithm.

setSpatialCalculationStepSize(self: depthai.node.SpatialDetectionNetwork, stepSize: int) → None

Specifies spatial location calculator step size for depth calculation. Step size 1 means that every pixel is taken into the calculation, step size 2 means every second pixel, and so on.

Parameter stepSize:

Step size.
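
As an illustration of these two settings, a minimal Python sketch, assuming the mobilenetSpatial node from the Usage section above (the chosen values are illustrative, not recommendations):

# Use the minimum depth value inside the ROI for the spatial calculation
mobilenetSpatial.setSpatialCalculationAlgorithm(dai.SpatialLocationCalculatorAlgorithm.MIN)
# Sample every second depth pixel inside the ROI to reduce computation
mobilenetSpatial.setSpatialCalculationStepSize(2)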

property spatialLocationCalculatorOutput

Output of SpatialLocationCalculator node, which is used internally by SpatialDetectionNetwork. Suitable when extra information is required from SpatialLocationCalculator node, e.g. minimum, maximum distance.
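
A minimal Python sketch for reading this output on the host, assuming the mobilenetSpatial node from the Usage section above; the stream name and the spatialDataQueue output queue are hypothetical:

xoutSpatialData = pipeline.create(dai.node.XLinkOut)
xoutSpatialData.setStreamName("spatialData")  # hypothetical stream name
mobilenetSpatial.spatialLocationCalculatorOutput.link(xoutSpatialData.input)

# On the host: the message is SpatialLocationCalculatorData
spatialData = spatialDataQueue.get()
for location in spatialData.getSpatialLocations():
    print(location.depthMin, location.depthMax, location.spatialCoordinates.z)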

class dai::node::MobileNetSpatialDetectionNetwork : public dai::NodeCRTP<SpatialDetectionNetwork, MobileNetSpatialDetectionNetwork, SpatialDetectionNetworkProperties>

MobileNetSpatialDetectionNetwork node. Mobilenet-SSD based network with spatial location data.

Public Functions

MobileNetSpatialDetectionNetwork(const std::shared_ptr<PipelineImpl> &par, int64_t nodeId)
