UVC & Mono Camera

This example demonstrates how to use the mono camera on your OAK device as a webcam. The UVC feature allows the OAK device to act as a regular webcam in applications such as OpenCV’s cv2.VideoCapture(), native camera apps, and more.

How It Works:

The MonoCamera node outputs image data in the GRAY8 format. However, the UVC node expects the data in NV12 format. To bridge this gap, an intermediary ImageManip node is used to convert the GRAY8 output from the MonoCamera node to NV12 format, which is then passed to the UVC node for streaming.
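
The conversion and linking step is sketched below. The node names (pipeline, mono_left, uvc) match the full example under Source code; only the GRAY8-to-NV12 bridge is shown here.

# GRAY8 (MonoCamera) -> NV12 (ImageManip) -> UVC
manip = pipeline.createImageManip()
manip.initialConfig.setFrameType(dai.RawImgFrame.Type.NV12)  # request NV12 output frames
manip.setMaxOutputFrameSize(int(1280 * 720 * 1.5))           # NV12 uses 1.5 bytes per pixel
mono_left.out.link(manip.inputImage)                         # MonoCamera -> ImageManip
manip.out.link(uvc.input)                                    # ImageManip -> UVC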


Setup

Please run the install script to download all required dependencies. Note that this script must be run from within the git repository, so you have to clone the depthai-python repository first and then run the script:

git clone https://github.com/luxonis/depthai-python.git
cd depthai-python/examples
python3 install_requirements.py

For additional information, please follow the installation guide.

Code used for testing

import cv2

# Initialize the VideoCapture object. Index 1 is used here for the OAK UVC stream;
# the index may differ on your system (0 is typically the built-in webcam)
cap = cv2.VideoCapture(1)

# Check if the camera opened successfully
if not cap.isOpened():
    print("Error: Could not open camera.")
    exit()

# Loop to continuously get frames from the camera
while True:
    ret, frame = cap.read()

    if not ret:
        print("Error: Could not read frame.")
        break

    cv2.imshow('Video Feed', frame)

    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

Source code

Also available on GitHub

#!/usr/bin/env python3

import time

import depthai as dai

pipeline = dai.Pipeline()

# Define a source - left mono (grayscale) camera
mono_left = pipeline.createMonoCamera()

mono_left.setResolution(dai.MonoCameraProperties.SensorResolution.THE_720_P)
mono_left.setBoardSocket(dai.CameraBoardSocket.CAM_B)

# Create a UVC (USB Video Class) output node
uvc = pipeline.createUVC()

# Manip for frame type conversion
manip = pipeline.createImageManip()
manip.initialConfig.setResize(1280, 720)
manip.initialConfig.setFrameType(dai.RawImgFrame.Type.NV12)
manip.setMaxOutputFrameSize(int(1280*720*1.5))

# Linking
manip.out.link(uvc.input)
mono_left.out.link(manip.inputImage)

# Note: if the pipeline is sent later to device (using startPipeline()),
# it is important to pass the device config separately when creating the device
config = dai.Device.Config()
config.board.uvc = dai.BoardConfig.UVC(1280, 720)
config.board.uvc.frameType = dai.ImgFrame.Type.NV12
# config.board.uvc.cameraName = "My Custom Cam"
pipeline.setBoardConfig(config.board)


# Standard UVC load with depthai
with dai.Device(pipeline) as device:
    # Dot projector
    device.setIrLaserDotProjectorBrightness(765)
    print("\nDevice started, please keep this process running")
    print("and open an UVC viewer to check the camera stream.")
    print("\nTo close: Ctrl+C")

    # Doing nothing here, just keeping the host feeding the watchdog
    while True:
        try:
            time.sleep(0.1)
        except KeyboardInterrupt:
            break
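
As the note inside the code mentions, when the pipeline is uploaded later with startPipeline() the board configuration has to be passed while creating the device. A minimal sketch of that variant, assuming the same pipeline and config objects as above and that your depthai version exposes the Device(config) constructor:

# Variant sketch only: create the device from its Config first, then upload the pipeline.
# Assumes Device(config) and startPipeline(pipeline) are available in your depthai version.
with dai.Device(config) as device:
    device.startPipeline(pipeline)
    # Keep the host process alive to feed the watchdog, as in the main example
    while True:
        time.sleep(0.1)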

Got questions?

Head over to the Discussion Forum for technical support or any other questions you might have.