Host Camera

Supported on: RVC2, RVC4

This example demonstrates how to use a host machine's camera (such as a laptop webcam) within a DepthAI pipeline via the HostCamera node. This lets you run your webcam (or any other camera connected to the host machine) as part of a DepthAI pipeline, combining host-side capture with the power of RVC hardware for processing.

How It Works

The HostCamera class is a custom host node that captures frames from the host machine's camera using OpenCV. The captured frames are sent into the DepthAI pipeline as ImgFrame messages, where other nodes, such as neural networks or object trackers, can process them.

HostCamera is a threaded host node, meaning it runs in a separate thread from the main pipeline. This allows the camera to capture frames independently of the rest of the pipeline, ensuring smooth operation.

This example requires the DepthAI v3 API; see the installation instructions.
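The threaded-node pattern is essentially a producer loop running on its own thread, pushing messages to an output queue that the rest of the pipeline consumes. A standard-library-only sketch of that idea (purely illustrative; it does not use DepthAI, and `ThreadedProducer` is a made-up name for this sketch):

```python
import queue
import threading


class ThreadedProducer:
    """Mimics a threaded host node: a run() loop on its own thread
    pushes messages to an output queue, independently of the consumer."""

    def __init__(self):
        self.output = queue.Queue()
        self._running = threading.Event()
        self._thread = None

    def start(self):
        self._running.set()
        self._thread = threading.Thread(target=self.run, daemon=True)
        self._thread.start()

    def stop(self):
        self._running.clear()
        self._thread.join()

    def run(self):
        frame_id = 0
        while self._running.is_set():
            # Stand-in for cap.read(): produce a dummy "frame"
            self.output.put(frame_id)
            frame_id += 1
            if frame_id >= 5:
                break


producer = ThreadedProducer()
producer.start()
# The consumer blocks on the queue, just like camQueue.get() below
frames = [producer.output.get() for _ in range(5)]
producer.stop()
print(frames)  # → [0, 1, 2, 3, 4]
```

In the real example, DepthAI's ThreadedHostNode plays the role of the thread management above, and `self.createOutput()` provides the queue-like output link.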

Source Code

Python
import depthai as dai
import cv2
import time


class HostCamera(dai.node.ThreadedHostNode):
    def __init__(self):
        super().__init__()
        self.output = self.createOutput()

    def run(self):
        # Create a VideoCapture object for the default host camera
        cap = cv2.VideoCapture(0)
        if not cap.isOpened():
            # p is the pipeline created in the with-block below
            p.stop()
            raise RuntimeError("Error: Couldn't open host camera")
        while self.mainLoop():
            # Read the frame from the camera
            ret, frame = cap.read()
            if not ret:
                break
            # Create an ImgFrame message
            imgFrame = dai.ImgFrame()
            imgFrame.setData(frame)
            imgFrame.setWidth(frame.shape[1])
            imgFrame.setHeight(frame.shape[0])
            imgFrame.setType(dai.ImgFrame.Type.BGR888i)
            # Send the message
            self.output.send(imgFrame)
            # Wait for the next frame
            time.sleep(0.1)
        cap.release()


with dai.Pipeline(createImplicitDevice=False) as p:
    hostCamera = p.create(HostCamera)
    camQueue = hostCamera.output.createOutputQueue()

    p.start()
    while p.isRunning():
        image: dai.ImgFrame = camQueue.get()
        cv2.imshow("HostCamera", image.getCvFrame())
        key = cv2.waitKey(1)
        if key == ord('q'):
            p.stop()
            break

C++

#include <atomic>
#include <chrono>
#include <csignal>
#include <iostream>
#include <memory>
#include <opencv2/opencv.hpp>
#include <thread>

#include "depthai/depthai.hpp"

// Global flag for graceful shutdown
std::atomic<bool> quitEvent(false);

// Signal handler
void signalHandler(int signum) {
    quitEvent = true;
}

// Custom threaded host node for camera capture
class HostCamera : public dai::node::CustomThreadedNode<HostCamera> {
   public:
    HostCamera() {
        output = std::shared_ptr<dai::Node::Output>(new dai::Node::Output(*this, {"output", DEFAULT_GROUP, {{{dai::DatatypeEnum::ImgFrame, false}}}}));
    }

    void run() override {
        std::cout << "HostCamera running" << std::endl;
        // Create a VideoCapture object
        cv::VideoCapture cap(0);
        if(!cap.isOpened()) {
            std::cerr << "Error: Couldn't open host camera" << std::endl;
            stopPipeline();
            return;
        }

        while(mainLoop()) {
            // Read the frame from the camera
            cv::Mat frame;
            if(!cap.read(frame)) {
                break;
            }

            // Create an ImgFrame message
            auto imgFrame = std::make_shared<dai::ImgFrame>();

            std::vector<uchar> buffer(frame.data, frame.data + frame.total() * frame.elemSize());
            imgFrame->setData(buffer);
            imgFrame->setWidth(frame.cols);
            imgFrame->setHeight(frame.rows);
            imgFrame->setType(dai::ImgFrame::Type::BGR888i);

            // Send the message
            output->send(imgFrame);

            // Wait for the next frame
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }

        cap.release();
    }

    std::shared_ptr<dai::Node::Output> output;
};

int main() {
    // Set up signal handlers
    signal(SIGTERM, signalHandler);
    signal(SIGINT, signalHandler);

    // Create pipeline without implicit device
    dai::Pipeline pipeline(false);

    // Create host camera node
    auto hostCamera = pipeline.create<HostCamera>();
    auto camQueue = hostCamera->output->createOutputQueue();

    // Start pipeline
    pipeline.start();
    std::cout << "Host camera started. Press 'q' to quit." << std::endl;

    while(pipeline.isRunning() && !quitEvent) {
        auto image = camQueue->get<dai::ImgFrame>();
        if(image == nullptr) continue;

        cv::imshow("HostCamera", image->getCvFrame());

        int key = cv::waitKey(1);
        if(key == 'q') {
            pipeline.stop();
            break;
        }
    }

    // Cleanup
    pipeline.stop();
    pipeline.wait();

    return 0;
}
