The MobileNet detection network node is very similar to NeuralNetwork (in fact, it extends it). The difference is that this node is specifically for MobileNet-based networks and decodes their results on the device. This means that the output of this node is not a raw byte array but an ImgDetections message that can easily be used in your code.

How to place it

Python:

pipeline = dai.Pipeline()
mobilenetDet = pipeline.createMobileNetDetectionNetwork()

C++:

dai::Pipeline pipeline;
auto mobilenetDet = pipeline.create<dai::node::MobileNetDetectionNetwork>();

Inputs and Outputs

            ┌───────────────────┐
            │                   │       out
            │                   ├───────────►
            │     MobileNet     │
            │     Detection     │
input       │     Network       │ passthrough
───────────►│                   ├───────────►
            └───────────────────┘

Message types

input - ImgFrame
out - ImgDetections
passthrough - ImgFrame


class depthai.MobileNetDetectionNetwork

MobileNetDetectionNetwork node. Parses MobileNet results

class Connection

Connection between an Input and Output

class Id

Node identifier. Unique for every node on a single Pipeline


Properties

alias of depthai.DetectionNetworkProperties

getAssets(self: depthai.Node) → List[depthai.Asset]

Retrieves all of the node's assets

getInputs(self: depthai.Node) → List[dai::Node::Input]

Retrieves all of the node's inputs

getName(self: depthai.Node) → str

Retrieves the node's name

getNumInferenceThreads(self: depthai.NeuralNetwork) → int

How many inference threads will be used to run the network

Returns:

Number of threads: 0, 1 or 2. Zero means AUTO

getOutputs(self: depthai.Node) → List[dai::Node::Output]

Retrieves all of the node's outputs

property id

Id of node

property input

Input message with data to be inferred upon. Default queue is blocking with size 5

property out

Outputs ImgDetections message that carries parsed detection results.

property passthrough

Passthrough message on which the inference was performed.

Suitable for when input queue is set to non-blocking behavior.

setBlobPath(self: depthai.NeuralNetwork, path: str) → None

Load network blob into assets and use once pipeline is started.

Throws if file doesn’t exist or isn’t a valid network blob.

Parameter path:

Path to network blob

setConfidenceThreshold(self: depthai.DetectionNetwork, thresh: float) → None

Specifies the confidence threshold below which detections are filtered out.

Parameter thresh:

Detection confidence must be greater than specified threshold to be added to the list

setNumInferenceThreads(self: depthai.NeuralNetwork, numThreads: int) → None

Sets how many threads the node should use to run the network.

Parameter numThreads:

Number of threads to dedicate to this node

setNumNCEPerInferenceThread(self: depthai.NeuralNetwork, numNCEPerThread: int) → None

Sets how many Neural Compute Engines a single inference thread should use.

Parameter numNCEPerThread:

Number of NCE per thread

setNumPoolFrames(self: depthai.NeuralNetwork, numFrames: int) → None

Specifies how many frames will be available in the pool

Parameter numFrames:

How many frames the pool will have

class dai::node::MobileNetDetectionNetwork : public dai::node::DetectionNetwork

MobileNetDetectionNetwork node. Parses MobileNet results.

Public Functions

MobileNetDetectionNetwork(const std::shared_ptr<PipelineImpl> &par, int64_t nodeId)
