# Multi-Device Setup

You can find [demo scripts here](https://github.com/luxonis/oak-examples/tree/main/tutorials/multiple-devices). This tutorial shows how to
discover multiple OAK cameras connected to your system and use each of them individually.

## Discovering OAK cameras

You can use DepthAI to discover all connected OAK cameras, either over USB or over the LAN (OAK POE cameras). The code snippet
below finds all connected OAK cameras and prints their Device IDs (unique identifiers) and their XLink states.

```python
import depthai
for device in depthai.Device.getAllAvailableDevices():
    print(f"{device.getDeviceId()} {device.state}")
```

Example output for a system with three connected devices:

```bash
14442C10D13EABCE00 XLinkDeviceState.X_LINK_UNBOOTED
14442C1071659ACD00 XLinkDeviceState.X_LINK_UNBOOTED
3604808376 XLinkDeviceState.X_LINK_GATE
```
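If you want to pick a device programmatically rather than read the printout, the same `DeviceID state` pairs can be handled as plain data. A minimal host-side sketch (the helper name and listing values are illustrative, not part of the DepthAI API):

```python
def parse_device_listing(lines):
    """Split each 'DeviceID state' line into a (device_id, state) tuple."""
    devices = []
    for line in lines:
        device_id, state = line.split(maxsplit=1)
        devices.append((device_id, state))
    return devices

# Example listing, matching the output shown above
listing = [
    "14442C10D13EABCE00 XLinkDeviceState.X_LINK_UNBOOTED",
    "14442C1071659ACD00 XLinkDeviceState.X_LINK_UNBOOTED",
    "3604808376 XLinkDeviceState.X_LINK_GATE",
]
for device_id, state in parse_device_listing(listing):
    print(device_id, state)
```

In practice you would iterate over `depthai.Device.getAllAvailableDevices()` directly and compare `device.getDeviceId()` against the ID you want.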

## Selecting a Specific DepthAI device to be used

From the detected devices above, use the following code to select the device you would like to use with your pipeline. For
example, to select the first device from the list above:

```python
# Specify DeviceID, IP Address or USB path
device_info = depthai.DeviceInfo("14442C108144F1D000") # DeviceID
#device_info = depthai.DeviceInfo("192.168.1.44") # IP Address
#device_info = depthai.DeviceInfo("3.3.3") # USB port name
with depthai.Device(device_info) as device:
    # ...
```

You can use this code as a basis for your own use cases, for example to run different neural network models on different OAK
devices.
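One simple way to assign a different model to each device is to key the choice off the Device ID. A host-side sketch, where the mapping, device IDs, and model names are all hypothetical:

```python
# Hypothetical mapping from Device ID to the model that device should run
MODEL_FOR_DEVICE = {
    "14442C10D13EABCE00": "mobilenet-ssd",
    "14442C1071659ACD00": "yolov6n",
}

def model_for(device_id, default="mobilenet-ssd"):
    """Pick the model assigned to a device, falling back to a default."""
    return MODEL_FOR_DEVICE.get(device_id, default)

print(model_for("14442C1071659ACD00"))  # yolov6n
print(model_for("3604808376"))          # mobilenet-ssd (unlisted device, default)
```

Each device's pipeline would then be built with the model returned by `model_for(device.getDeviceId())`.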

## Specifying POE device to be used

You can also specify a POE device by its IP address, as shown in the code snippet above.

Now use as many OAK cameras as you need! And since DepthAI does all the heavy lifting on-device, you can usually run quite a few of
them with very little burden on the host.

## Timestamp syncing

Timestamp synchronization, alternatively referred to as message syncing, involves aligning messages from various sensors,
including frames, IMU packets, ToF data, and more.

> DepthAI 2.24 introduced the Sync node, which can be used to sync messages from different streams, or messages from different
> sensors (e.g. IMU and color frames). See the Sync node documentation for more details. The Sync node does not currently support
> syncing across multiple devices, so if you want to sync messages from multiple devices, you should use the manual approach.

More information about timestamp synchronization can be found on the [Frame synchronization
page](https://docs.luxonis.com/hardware/platform/deploy/frame-sync.md).
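A common host-side pattern for the manual approach is to pair each message from one device with the closest-timestamped message from another, dropping pairs that differ by more than a threshold. A simplified sketch with plain `(timestamp_ms, frame)` tuples standing in for DepthAI messages (function name, threshold, and data are all illustrative):

```python
def match_by_timestamp(stream_a, stream_b, max_diff_ms=15):
    """Pair (timestamp, frame) items from two streams by nearest timestamp.

    Both streams must be sorted by timestamp. Pairs whose timestamps differ
    by more than max_diff_ms are dropped.
    """
    pairs = []
    j = 0
    for ts_a, frame_a in stream_a:
        # Advance j while the next b-item is at least as close to ts_a
        while j + 1 < len(stream_b) and \
                abs(stream_b[j + 1][0] - ts_a) <= abs(stream_b[j][0] - ts_a):
            j += 1
        ts_b, frame_b = stream_b[j]
        if abs(ts_b - ts_a) <= max_diff_ms:
            pairs.append((frame_a, frame_b))
    return pairs

# Two 30 FPS-ish streams with slightly offset timestamps (milliseconds)
a = [(0, "a0"), (33, "a1"), (66, "a2")]
b = [(5, "b0"), (40, "b1"), (100, "b2")]
print(match_by_timestamp(a, b))  # [('a0', 'b0'), ('a1', 'b1')]
```

With real devices you would use `msg.getTimestamp()` (or `getTimestampDevice()`) on the queued messages instead of the integer timestamps above.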

## Multi camera demo

```python
#!/usr/bin/env python3

import cv2
import depthai as dai
import contextlib

def createPipeline(pipeline):
    camRgb = pipeline.create(dai.node.Camera).build(dai.CameraBoardSocket.CAM_A)
    output = camRgb.requestOutput((1280, 800), dai.ImgFrame.Type.NV12, dai.ImgResizeMode.CROP, 20).createOutputQueue()
    return pipeline, output

with contextlib.ExitStack() as stack:
    deviceInfos = dai.Device.getAllAvailableDevices()
    print("=== Found devices: ", deviceInfos)
    queues = []
    pipelines = []

    for deviceInfo in deviceInfos:
        pipeline = stack.enter_context(dai.Pipeline())
        device = pipeline.getDefaultDevice()
        
        print("=== Connected to", deviceInfo.getDeviceId())
        deviceId = device.getDeviceId()
        cameras = device.getConnectedCameras()
        usbSpeed = device.getUsbSpeed()
        eepromData = device.readCalibration2().getEepromData()
        print("   >>> Device ID:", deviceId)
        print("   >>> Num of cameras:", len(cameras))
        if eepromData.boardName != "":
            print("   >>> Board name:", eepromData.boardName)
        if eepromData.productName != "":
            print("   >>> Product name:", eepromData.productName)
        
        pipeline, output = createPipeline(pipeline)
        pipeline.start()
        pipelines.append(pipeline)

        queues.append(output)

    while True:
        for i, stream in enumerate(queues):
            videoIn = stream.get()
            assert isinstance(videoIn, dai.ImgFrame)
            cv2.imshow(f"video_device{i}", videoIn.getCvFrame())
        if cv2.waitKey(1) == ord('q'):
            break
```

## Multi camera calibration

This example demonstrates how to compute extrinsic parameters (pose of the camera) for multiple cameras. It provides a practical
illustration of how to determine the relative positions and orientations of different cameras in a multi-camera setup. By
accurately estimating the extrinsic parameters, we can ensure that the images captured by each camera are correctly aligned and
can be effectively combined for further processing and analysis.
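Once each camera's pose in a shared world frame (e.g. the calibration target) is known, the transform between two cameras follows by composition: if `T1w` maps world coordinates into camera 1 and `T2w` into camera 2, then the camera-1-to-camera-2 transform is `T21 = T2w · T1w⁻¹`. A pure-Python sketch of that composition for 4×4 rigid transforms (the matrices and values are illustrative, not real calibration output):

```python
def matmul4(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(T):
    """Invert a rigid transform [R|t]: the inverse is [R^T | -R^T t]."""
    R = [row[:3] for row in T[:3]]
    t = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    t_inv = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [t_inv[0]],
            Rt[1] + [t_inv[1]],
            Rt[2] + [t_inv[2]],
            [0, 0, 0, 1]]

# World-to-camera poses for two cameras (illustrative values):
T1w = [[1, 0, 0, 0.0], [0, 1, 0, 0.0], [0, 0, 1, 0.0], [0, 0, 0, 1]]   # camera 1 at origin
T2w = [[1, 0, 0, -0.1], [0, 1, 0, 0.0], [0, 0, 1, 0.0], [0, 0, 0, 1]]  # camera 2 shifted 10 cm

T21 = matmul4(T2w, invert_rigid(T1w))  # camera 1 frame expressed in camera 2
print(T21[0][3])  # x-offset between the cameras: -0.1
```

In the linked example, each `Tiw` would come from detecting the calibration pattern (e.g. via OpenCV's `solvePnP`) in that camera's image.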

### Multiple camera calibration on GitHub

[Multiple camera calibration on
GitHub](https://github.com/luxonis/oak-examples/tree/main/tutorials/multiple-devices/multi-cam-calibration)
