The Ultralytics Python package provides a user-friendly interface for training, evaluating, and deploying deep learning models. It supports both custom dataset training and pre-trained model inference, making it accessible to beginners while offering advanced customization options for experienced developers.

Ultralytics is best known for spearheading the development of YOLO models, including the latest versions such as YOLOv8, YOLOv10, and YOLOv11. The package also supports other architectures such as SAM, DETR, and more. You can read about it in the official documentation.

Key Features (see the usage sketch after this list):
Ready-to-use pre-trained models for quick deployment.
Custom model training with minimal configuration.
Support for multiple export formats, including ONNX, TensorRT, and OpenVINO.
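The snippet below is a minimal sketch of this basic workflow with the Ultralytics Python API: load a pre-trained model, run inference, fine-tune on a custom dataset, and export for deployment. The model name, dataset YAML, image path, and training settings are placeholders chosen for illustration.

```python
from ultralytics import YOLO

# Load a pre-trained detection model (weights are downloaded on first use)
model = YOLO("yolov8n.pt")

# Run inference on an image (placeholder path)
results = model("path/to/image.jpg")

# Fine-tune on a custom dataset described by a dataset YAML (placeholder path/settings)
model.train(data="path/to/data.yaml", epochs=50, imgsz=640)

# Export the model to ONNX for deployment
model.export(format="onnx")
```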
Introduction to YOLO Models
YOLO (You Only Look Once) is a family of real-time object detection models known for their speed and accuracy. Unlike traditional detection methods that analyze images in multiple steps, YOLO models frame detection as a single-pass regression problem, predicting bounding boxes and class probabilities directly from the entire image in one evaluation.

Initially, YOLO models were designed for object detection, but newer versions now support multi-head tasks such as pose estimation, segmentation, and more.
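In the Ultralytics package, these tasks are exposed through task-specific pre-trained weights that share the same API. The short sketch below loads detection, segmentation, and pose variants; the weight names follow Ultralytics' published naming scheme, and the image path is a placeholder.

```python
from ultralytics import YOLO

# Object detection: results carry bounding boxes and class scores
det_results = YOLO("yolov8n.pt")("path/to/image.jpg")
print(det_results[0].boxes)

# Instance segmentation: results additionally carry per-instance masks
seg_results = YOLO("yolov8n-seg.pt")("path/to/image.jpg")
print(seg_results[0].masks)

# Pose estimation: results additionally carry keypoints
pose_results = YOLO("yolov8n-pose.pt")("path/to/image.jpg")
print(pose_results[0].keypoints)
```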
Usage
We’ve made it easy to deploy YOLO models trained with Ultralytics on Luxonis devices. To help you get started, we provide end-to-end tutorials, categorized by task.

Object Detection

Deploying a YOLO detection model involves post-processing its raw outputs:
Decoding raw outputs into bounding boxes (and keypoints or instance masks for certain models).
Non-Maximum Suppression (NMS) to filter overlapping detections (a standalone sketch of this step follows the list).
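To make the NMS step concrete, here is a minimal, framework-agnostic sketch of greedy NMS in NumPy. The IoU threshold is an arbitrary example value; on Luxonis devices you do not implement this yourself, since the on-device post-processing handles it.

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes, all in (x1, y1, x2, y2) format."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter)

def nms(boxes, scores, iou_threshold=0.5):
    """Greedily keep the highest-scoring box, dropping boxes that overlap it too much."""
    order = np.argsort(scores)[::-1]  # indices sorted by descending confidence
    keep = []
    while order.size > 0:
        best = order[0]
        keep.append(best)
        rest = order[1:]
        order = rest[iou(boxes[best], boxes[rest]) < iou_threshold]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # -> [0, 2]: the near-duplicate of box 0 is suppressed
```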
Once your object detection model is converted into an RVC-ready format, you can integrate it using the DetectionNetwork node, which handles all necessary post-processing.

For keypoint detection or instance segmentation models, refer to the YOLO Models section on the Inference page. In these cases, use the DepthAI Nodes package and set YOLOExtendedParser as the head parser in the NNArchive.
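For context, the following is a rough sketch of what the DetectionNetwork integration can look like with the DepthAI v3 Python API. The archive path is a placeholder, and the exact node and queue method names (e.g., Camera.build, DetectionNetwork.build accepting an NNArchive) should be verified against the linked tutorials, since they may differ between DepthAI versions.

```python
import depthai as dai

# Placeholder path to the converted, RVC-ready model archive
ARCHIVE_PATH = "path/to/yolo_model.tar.xz"

with dai.Pipeline() as pipeline:
    # On-device camera feeding frames to the network
    camera = pipeline.create(dai.node.Camera).build()

    # DetectionNetwork loads the archive and runs decoding + NMS on device
    nn = pipeline.create(dai.node.DetectionNetwork).build(
        camera, dai.NNArchive(ARCHIVE_PATH)
    )

    detections_queue = nn.out.createOutputQueue()

    pipeline.start()
    while pipeline.isRunning():
        msg = detections_queue.get()
        for det in msg.detections:
            print(det.label, det.confidence, det.xmin, det.ymin, det.xmax, det.ymax)
```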