OpenCV support¶
This example shows the API that exposes both numpy- and OpenCV-compatible image types for easier usage. It uses a ColorCamera node to retrieve both the BGR interleaved 'preview' frames and the NV12 encoded 'video' frames. Both are displayed with OpenCV: the 'preview' frames via getFrame and the 'video' frames via getCvFrame.
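The practical difference between the two accessors is sketched below. This is a minimal illustration only; the shapes assume the 1080p 'video' and 300x300 'preview' streams configured in the full example further down this page.

# Sketch only - assumes output queues named 'video' and 'preview',
# as created in the full example below.
videoFrame = video.get()        # dai.ImgFrame carrying NV12 encoded data
previewFrame = preview.get()    # dai.ImgFrame carrying interleaved BGR data

raw = videoFrame.getFrame()     # raw NV12 buffer, roughly shape (1080 * 3 // 2, 1920)
bgr = videoFrame.getCvFrame()   # converted to BGR, shape (1080, 1920, 3), ready for cv2.imshow

prv = previewFrame.getFrame()   # already interleaved BGR, shape (300, 300, 3), no conversion needed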
Demo¶
Setup¶
Please run the install script to download all required dependencies. Please note that this script must be run from the git context, so you have to clone the depthai-python repository first and then run the script:
git clone https://github.com/luxonis/depthai-python.git
cd depthai-python/examples
python3 install_requirements.py
For additional information, please follow the installation guide.
Source code¶
Python (also available on GitHub)
#!/usr/bin/env python3

import cv2
import depthai as dai

# Create pipeline
pipeline = dai.Pipeline()

# Define source and outputs
camRgb = pipeline.create(dai.node.ColorCamera)
xoutVideo = pipeline.create(dai.node.XLinkOut)
xoutPreview = pipeline.create(dai.node.XLinkOut)

xoutVideo.setStreamName("video")
xoutPreview.setStreamName("preview")

# Properties
camRgb.setPreviewSize(300, 300)
camRgb.setBoardSocket(dai.CameraBoardSocket.RGB)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
camRgb.setInterleaved(True)
camRgb.setColorOrder(dai.ColorCameraProperties.ColorOrder.BGR)

# Linking
camRgb.video.link(xoutVideo.input)
camRgb.preview.link(xoutPreview.input)

# Connect to device and start pipeline
with dai.Device(pipeline) as device:

    video = device.getOutputQueue('video')
    preview = device.getOutputQueue('preview')

    while True:
        videoFrame = video.get()
        previewFrame = preview.get()

        # Get BGR frame from NV12 encoded video frame to show with opencv
        cv2.imshow("video", videoFrame.getCvFrame())
        # Show 'preview' frame as is (already in correct format, no copy is made)
        cv2.imshow("preview", previewFrame.getFrame())

        if cv2.waitKey(1) == ord('q'):
            break
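For the NV12 'video' frames, getCvFrame() performs a YUV to BGR conversion before returning the array. A rough manual equivalent, assuming the raw buffer returned by getFrame() is in NV12 layout, could look like the sketch below (illustrative only, not the library's internal implementation):

import cv2

# videoFrame is a dai.ImgFrame taken from the 'video' queue above
nv12 = videoFrame.getFrame()                      # raw NV12 data, shape (height * 3 // 2, width)
bgr = cv2.cvtColor(nv12, cv2.COLOR_YUV2BGR_NV12)  # roughly the same BGR image that getCvFrame() returns
cv2.imshow("video", bgr)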
C++ (also available on GitHub)
#include <iostream>

// Includes common necessary includes for development using depthai library
#include "depthai/depthai.hpp"

// Include OpenCV
#include <opencv2/opencv.hpp>

int main() {
    // Create pipeline
    dai::Pipeline pipeline;

    // Define source and outputs
    auto camRgb = pipeline.create<dai::node::ColorCamera>();
    auto xoutVideo = pipeline.create<dai::node::XLinkOut>();
    auto xoutPreview = pipeline.create<dai::node::XLinkOut>();

    xoutVideo->setStreamName("video");
    xoutPreview->setStreamName("preview");

    // Properties
    camRgb->setPreviewSize(300, 300);
    camRgb->setBoardSocket(dai::CameraBoardSocket::RGB);
    camRgb->setResolution(dai::ColorCameraProperties::SensorResolution::THE_1080_P);
    camRgb->setInterleaved(true);
    camRgb->setColorOrder(dai::ColorCameraProperties::ColorOrder::BGR);

    // Linking
    camRgb->video.link(xoutVideo->input);
    camRgb->preview.link(xoutPreview->input);

    // Connect to device and start pipeline
    dai::Device device(pipeline);

    auto video = device.getOutputQueue("video");
    auto preview = device.getOutputQueue("preview");

    while(true) {
        auto videoFrame = video->get<dai::ImgFrame>();
        auto previewFrame = preview->get<dai::ImgFrame>();

        // Get BGR frame from NV12 encoded video frame to show with opencv
        cv::imshow("video", videoFrame->getCvFrame());
        // Show 'preview' frame as is (already in correct format, no copy is made)
        cv::imshow("preview", previewFrame->getFrame());

        int key = cv::waitKey(1);
        if(key == 'q' || key == 'Q') return 0;
    }
    return 0;
}