
NN Archive Superblob

This example shows how to load a YOLOv6-Nano superblob model from the model zoo, configure a NeuralNetwork node with a 416x416 RGB camera input, and process the detection and passthrough outputs. This example requires the DepthAI v3 API; see the installation instructions.

Pipeline

Source code

Python
#!/usr/bin/env python3

import time
import depthai as dai

# We will download a superblob NNArchive from the model zoo
# Pick your own model from the model zoo
modelDescription = dai.NNModelDescription()
modelDescription.model = "yolov6-nano"
modelDescription.platform = "RVC2"

# Download model from zoo and load it
archivePath = dai.getModelFromZoo(modelDescription, useCached=True)
archive = dai.NNArchive(archivePath)

# Archive knows it is a superblob archive
assert archive.getModelType() == dai.ModelType.SUPERBLOB

# Therefore, getSuperBlob() is available
assert archive.getSuperBlob() is not None

# There is no blob or other model format available
assert archive.getBlob() is None
assert archive.getOtherModelFormat() is None

# You can access any config version
v1config: dai.nn_archive.v1.Config = archive.getConfig()

# Print some config fields
print("-" * 10)
print("Config fields:")
print(f"\tConfig version: {v1config.configVersion}")
print(f"\tModel heads: {v1config.model.heads}")
print(f"\tModel inputs: {v1config.model.inputs}")
print(f"\tModel metadata: {v1config.model.metadata}")
print(f"\tModel outputs: {v1config.model.outputs}")
print("-" * 10)

with dai.Pipeline() as pipeline:
    # Color camera node
    camRgb = pipeline.create(dai.node.Camera).build()
    outCam = camRgb.requestOutput((416, 416), dai.ImgFrame.Type.BGR888p)

    # Neural network node: pick the blob compiled for 6 SHAVE cores
    blob = archive.getSuperBlob().getBlobWithNumShaves(6)
    neuralNetwork = pipeline.create(dai.node.NeuralNetwork)
    neuralNetwork.setBlob(blob)
    neuralNetwork.setNumInferenceThreads(2)

    # Linking
    outCam.link(neuralNetwork.input)

    nnDetectionQueue = neuralNetwork.out.createOutputQueue()
    nnPassthroughQueue = neuralNetwork.passthrough.createOutputQueue()

    pipeline.start()

    while pipeline.isRunning():
        in_nn = nnDetectionQueue.get()
        in_nnPassthrough = nnPassthroughQueue.get()
        print("Data received")
        time.sleep(0.1)
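Under the hood, an NN Archive downloaded from the zoo is an ordinary tar bundle (commonly `.tar.xz`) that holds the model file (here a superblob) alongside its `config.json`. As a hedged aside, assuming you have the `archivePath` returned by `dai.getModelFromZoo` above, you could peek inside it with the standard library alone:

```python
import tarfile


def list_archive_members(archive_path: str) -> list[str]:
    """List the files bundled inside an NN Archive tarball.

    tarfile.open() in default "r" mode transparently handles
    gzip/xz compression, so this works on .tar, .tar.gz, and .tar.xz.
    Expect to see a config.json plus the model file itself.
    """
    with tarfile.open(archive_path) as tar:
        return tar.getnames()
```

This is for inspection only; `dai.NNArchive` already does the unpacking and config parsing for you.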

Need assistance?

Head over to the Discussion Forum for technical support or any other questions you might have.