NN Archive Superblob

This example demonstrates loading a YOLOv6-Nano superblob model from the model zoo, configuring a NeuralNetwork node with a 416x416 RGB camera input, and reading the detection and passthrough outputs.

Setup

This example requires the DepthAI v3 API; see the installation instructions.

Pipeline

Source code

Python
#!/usr/bin/env python3

import time
import depthai as dai

# We will download a superblob NNArchive from the model zoo
# Pick your own model from the model zoo
modelDescription = dai.NNModelDescription()
modelDescription.model = "yolov6-nano"
modelDescription.platform = "RVC2"

# Download model from zoo and load it
archivePath = dai.getModelFromZoo(modelDescription, useCached=True)
archive = dai.NNArchive(archivePath)

# Archive knows it is a superblob archive
assert archive.getModelType() == dai.ModelType.SUPERBLOB

# Therefore, getSuperBlob() is available
assert archive.getSuperBlob() is not None

# The archive is unpacked, so a path to the superblob model is also available
assert archive.getModelPath() is not None

# There is no plain blob available
assert archive.getBlob() is None

# You can access any config version
v1config: dai.nn_archive.v1.Config = archive.getConfig()

# Print some config fields
print("-" * 10)
print("Config fields:")
print(f"\tConfig version: {v1config.configVersion}")
print(f"\tModel heads: {v1config.model.heads}")
print(f"\tModel inputs: {v1config.model.inputs}")
print(f"\tModel metadata: {v1config.model.metadata}")
print(f"\tModel outputs: {v1config.model.outputs}")
print("-" * 10)

with dai.Pipeline() as pipeline:
    # Color camera node
    camRgb = pipeline.create(dai.node.Camera).build()
    outCam = camRgb.requestOutput((416, 416), dai.ImgFrame.Type.BGR888p)

    # Neural network node: compile a blob from the superblob for 6 SHAVE cores
    blob = archive.getSuperBlob().getBlobWithNumShaves(6)
    neuralNetwork = pipeline.create(dai.node.NeuralNetwork)
    neuralNetwork.setBlob(blob)
    neuralNetwork.setNumInferenceThreads(2)

    # Linking
    outCam.link(neuralNetwork.input)

    nnDetectionQueue = neuralNetwork.out.createOutputQueue()
    nnPassthroughQueue = neuralNetwork.passthrough.createOutputQueue()

    pipeline.start()

    while pipeline.isRunning():
        in_nn = nnDetectionQueue.get()
        in_nnPassthrough = nnPassthroughQueue.get()
        print("Data received")
        time.sleep(0.1)

Need assistance?

Head over to the Discussion Forum for technical support or any other questions you might have.