# Host Camera

This example demonstrates how to use a host machine's camera (such as a laptop webcam) within a DepthAI pipeline via the
HostCamera node. This lets you run your webcam (or any other camera connected to the host machine) as part of a DepthAI
pipeline, while still leveraging RVC hardware for processing.

## How It Works

The HostCamera class is a custom host node that captures frames from the host machine's camera using OpenCV. The captured
frames are then sent into the DepthAI pipeline as ImgFrame messages, where other nodes, such as neural networks or object
trackers, can process them.

The HostCamera node is a threaded host node, which means it runs in a separate thread from the main pipeline. This allows the
camera to capture frames independently of the rest of the pipeline, ensuring smooth operation.
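The producer/consumer pattern behind a threaded host node can be sketched in plain Python, with no DepthAI dependency: a capture thread pushes frames into a queue while the main thread consumes them. This is only an illustration of the threading model; the names (`capture_loop`, the string "frames") are ours and not part of the DepthAI API:

```python
import threading
import queue

def capture_loop(out_q: queue.Queue, n_frames: int) -> None:
    """Producer thread: stands in for the camera-capture loop of a host node."""
    for i in range(n_frames):
        frame = f"frame-{i}"  # placeholder for a real image
        out_q.put(frame)
    out_q.put(None)  # sentinel: no more frames

# The bounded queue decouples capture from processing, like a node's output.
q: queue.Queue = queue.Queue(maxsize=4)
producer = threading.Thread(target=capture_loop, args=(q, 5))
producer.start()

# Consumer: stands in for the rest of the pipeline.
received = []
while (item := q.get()) is not None:
    received.append(item)
producer.join()
```

Because the queue blocks when full, a slow consumer naturally throttles the capture thread instead of dropping or piling up frames.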

This example requires the DepthAI v3 API, see [installation instructions](https://docs.luxonis.com/software-v3/depthai.md).

## Source Code

#### Python

```python
import depthai as dai
import cv2
import time

class HostCamera(dai.node.ThreadedHostNode):
    def __init__(self):
        super().__init__()
        self.output = self.createOutput()
    def run(self):
        # Create a VideoCapture object
        cap = cv2.VideoCapture(0)
        if not cap.isOpened():
            # `p` is the pipeline created in the `with` block below;
            # stop it before raising so the process exits cleanly
            p.stop()
            raise RuntimeError("Error: Couldn't open host camera")
        while self.mainLoop():
            # Read the frame from the camera
            ret, frame = cap.read()
            if not ret:
                break
            # Create an ImgFrame message
            imgFrame = dai.ImgFrame()
            imgFrame.setData(frame)
            imgFrame.setWidth(frame.shape[1])
            imgFrame.setHeight(frame.shape[0])
            imgFrame.setType(dai.ImgFrame.Type.BGR888i)
            # Send the message
            self.output.send(imgFrame)
            # Throttle capture to roughly 10 FPS
            time.sleep(0.1)

with dai.Pipeline(createImplicitDevice=False) as p:
    hostCamera = p.create(HostCamera)
    camQueue = hostCamera.output.createOutputQueue()

    p.start()
    while p.isRunning():
        image: dai.ImgFrame = camQueue.get()
        cv2.imshow("HostCamera", image.getCvFrame())
        key = cv2.waitKey(1)
        if key == ord('q'):
            p.stop()
            break
```
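The `BGR888i` type used above means interleaved BGR bytes, which is exactly how OpenCV stores a frame (height x width x 3, channels ordered B, G, R per pixel), so the raw buffer can be handed to `setData` without conversion. A small NumPy sketch, with no DepthAI required, shows the layout:

```python
import numpy as np

# A tiny 2x3 "frame" in OpenCV's native layout: height x width x channels,
# with channels ordered B, G, R (interleaved per pixel).
frame = np.zeros((2, 3, 3), dtype=np.uint8)
frame[0, 0] = (255, 0, 0)  # a pure-blue pixel at row 0, col 0

flat = frame.tobytes()  # the flat byte buffer that setData stores
# Total size is height * width * 3 bytes for BGR888i.
assert len(flat) == 2 * 3 * 3
# The first three bytes are the B, G, R values of the first pixel.
assert flat[0:3] == bytes([255, 0, 0])
```

This is why the example can pass the OpenCV frame straight through: no channel reordering or re-packing is needed.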

#### C++

```cpp
#include <atomic>
#include <chrono>
#include <csignal>
#include <iostream>
#include <memory>
#include <opencv2/opencv.hpp>
#include <thread>

#include "depthai/depthai.hpp"

// Global flag for graceful shutdown
std::atomic<bool> quitEvent(false);

// Signal handler
void signalHandler(int signum) {
    quitEvent = true;
}

// Custom threaded host node for camera capture
class HostCamera : public dai::node::CustomThreadedNode<HostCamera> {
   public:
    HostCamera() {
        output = std::shared_ptr<dai::Node::Output>(new dai::Node::Output(*this, {"output", DEFAULT_GROUP, {{{dai::DatatypeEnum::ImgFrame, false}}}}));
    }

    void run() override {
        std::cout << "HostCamera running" << std::endl;
        // Create a VideoCapture object
        cv::VideoCapture cap(0);
        if(!cap.isOpened()) {
            std::cerr << "Error: Couldn't open host camera" << std::endl;
            stopPipeline();
            return;
        }

        while(mainLoop()) {
            // Read the frame from the camera
            cv::Mat frame;
            if(!cap.read(frame)) {
                break;
            }

            // Create an ImgFrame message
            auto imgFrame = std::make_shared<dai::ImgFrame>();

            std::vector<uchar> buffer(frame.data, frame.data + frame.total() * frame.elemSize());
            imgFrame->setData(buffer);
            imgFrame->setWidth(frame.cols);
            imgFrame->setHeight(frame.rows);
            imgFrame->setType(dai::ImgFrame::Type::BGR888i);

            // Send the message
            output->send(imgFrame);

            // Throttle capture to roughly 10 FPS
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }

        cap.release();
    }

    std::shared_ptr<dai::Node::Output> output;
};

int main() {
    // Set up signal handlers
    signal(SIGTERM, signalHandler);
    signal(SIGINT, signalHandler);

    // Create pipeline without implicit device
    dai::Pipeline pipeline(false);

    // Create host camera node
    auto hostCamera = pipeline.create<HostCamera>();
    auto camQueue = hostCamera->output->createOutputQueue();

    // Start pipeline
    pipeline.start();
    std::cout << "Host camera started. Press 'q' to quit." << std::endl;

    while(pipeline.isRunning() && !quitEvent) {
        auto image = camQueue->get<dai::ImgFrame>();
        if(image == nullptr) continue;

        cv::imshow("HostCamera", image->getCvFrame());

        int key = cv::waitKey(1);
        if(key == 'q') {
            pipeline.stop();
            break;
        }
    }

    // Cleanup
    pipeline.stop();
    pipeline.wait();

    return 0;
}
```
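The C++ version installs signal handlers so that Ctrl+C (SIGINT) or SIGTERM sets a flag and lets the loop exit cleanly instead of killing the process mid-frame. The same pattern in Python, sketched here with only the standard library (it is not part of the example above), looks like:

```python
import signal

quit_event = False

def handle_signal(signum, frame):
    """Set a flag instead of exiting, so the main loop can clean up."""
    global quit_event
    quit_event = True

# Register the handler for Ctrl+C, mirroring the C++ example.
signal.signal(signal.SIGINT, handle_signal)

# Simulate the user pressing Ctrl+C; a real loop would check the flag
# in its while condition, e.g. `while p.isRunning() and not quit_event:`.
signal.raise_signal(signal.SIGINT)
```

After the handler runs, `quit_event` is `True` and the main loop would stop the pipeline and release the camera before exiting.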

### Need assistance?

Head over to [Discussion Forum](https://discuss.luxonis.com/) for technical support or any other questions you might have.
