

PointCloud Visualization

This example demonstrates how to visualize a point cloud created on-device using DepthAI and Open3D. It captures color and depth data from an OAK device, combines them into a colored point cloud, and displays it in real time. An FPS counter is included to monitor performance.

Key Concepts

  • Capturing synchronized color and depth frames using the DepthAI API.
  • Creating a point cloud with color data for visualization.
  • Real-time visualization of the point cloud using Open3D (a standalone sketch of this loop follows the list).
  • Implementing an FPS counter to monitor the application's performance.
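
Before the full device pipeline, the following minimal sketch illustrates the visualization concept in isolation. It is a device-free example that feeds synthetic random points (instead of OAK depth data) through the same non-blocking Open3D loop (add_geometry once, then update_geometry, poll_events, and update_renderer on every frame) that the complete script below uses:
Python
import numpy as np
import open3d as o3d

# Synthetic stand-in for the device point cloud: new random points every frame
vis = o3d.visualization.Visualizer()
vis.create_window()
pcd = o3d.geometry.PointCloud()

for i in range(300):
    points = np.random.uniform(-1.0, 1.0, size=(2000, 3))
    colors = np.random.uniform(0.0, 1.0, size=(2000, 3))
    pcd.points = o3d.utility.Vector3dVector(points)
    pcd.colors = o3d.utility.Vector3dVector(colors)
    if i == 0:
        vis.add_geometry(pcd)     # the geometry must be added once before it can be updated
    else:
        vis.update_geometry(pcd)  # afterwards, only update it in place
    if not vis.poll_events():     # poll_events() returns False once the window is closed
        break
    vis.update_renderer()

vis.destroy_window()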

Demo

Setup

If you are using Python, ensure you have DepthAI and Open3D installed in your Python environment:
Command Line
python3 -m pip install depthai open3d
If Open3D is not already installed, the script will prompt you to install it.
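To confirm that both packages import correctly before running the example, you can optionally run a quick check (both libraries expose a standard __version__ attribute):
Command Line
python3 -c "import depthai, open3d; print(depthai.__version__, open3d.__version__)"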

Source code

The example builds a DepthAI pipeline with a color camera, two mono cameras, and a stereo depth node that generates depth information. A PointCloud node converts the depth frames into 3D points, and a Sync node pairs each point cloud message with the matching color frame. The script visualizes the result with Open3D, coloring each point from the corresponding pixel of the RGB image.
Python
import depthai as dai
import numpy as np
import cv2
import time
import sys

try:
    import open3d as o3d
except ImportError:
    sys.exit("Critical dependency missing: Open3D. Please install it using the command: '{} -m pip install open3d' and then rerun the script.".format(sys.executable))

FPS = 30

class FPSCounter:
    """Tracks the measured frame rate, refreshing the estimate every 10 frames."""
    def __init__(self):
        self.frameCount = 0
        self.fps = 0
        self.startTime = time.time()

    def tick(self):
        self.frameCount += 1
        if self.frameCount % 10 == 0:
            elapsedTime = time.time() - self.startTime
            self.fps = self.frameCount / elapsedTime
            self.frameCount = 0
            self.startTime = time.time()
        return self.fps

# Build the pipeline: color camera + stereo pair -> depth -> point cloud,
# with a Sync node pairing each color frame with its point cloud message.
pipeline = dai.Pipeline()
camRgb = pipeline.create(dai.node.ColorCamera)
monoLeft = pipeline.create(dai.node.MonoCamera)
monoRight = pipeline.create(dai.node.MonoCamera)
depth = pipeline.create(dai.node.StereoDepth)
pointcloud = pipeline.create(dai.node.PointCloud)
sync = pipeline.create(dai.node.Sync)
xOut = pipeline.create(dai.node.XLinkOut)
xOut.input.setBlocking(False)

camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
camRgb.setBoardSocket(dai.CameraBoardSocket.CAM_A)
camRgb.setIspScale(1, 3)  # downscale the 1080p ISP output
camRgb.setFps(FPS)

monoLeft.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
monoLeft.setCamera("left")
monoLeft.setFps(FPS)
monoRight.setResolution(dai.MonoCameraProperties.SensorResolution.THE_400_P)
monoRight.setCamera("right")
monoRight.setFps(FPS)

depth.setDefaultProfilePreset(dai.node.StereoDepth.PresetMode.HIGH_DENSITY)
depth.initialConfig.setMedianFilter(dai.MedianFilter.KERNEL_7x7)
depth.setLeftRightCheck(True)
depth.setExtendedDisparity(False)
depth.setSubpixel(True)
depth.setDepthAlign(dai.CameraBoardSocket.CAM_A)  # align depth to the RGB camera

# Link the nodes together
monoLeft.out.link(depth.left)
monoRight.out.link(depth.right)
depth.depth.link(pointcloud.inputDepth)
camRgb.isp.link(sync.inputs["rgb"])
pointcloud.outputPointCloud.link(sync.inputs["pcl"])
sync.out.link(xOut.input)
xOut.setStreamName("out")

with dai.Device(pipeline) as device:
    isRunning = True

    def key_callback(vis, action, mods):
        global isRunning
        if action == 0:  # key release
            isRunning = False

    q = device.getOutputQueue(name="out", maxSize=4, blocking=False)
    vis = o3d.visualization.VisualizerWithKeyCallback()
    vis.create_window()
    vis.register_key_action_callback(81, key_callback)  # 81 is the key code for 'Q'
    pcd = o3d.geometry.PointCloud()
    coordinateFrame = o3d.geometry.TriangleMesh.create_coordinate_frame(size=1000, origin=[0, 0, 0])
    vis.add_geometry(coordinateFrame)

    first = True
    fpsCounter = FPSCounter()
    while isRunning:
        inMessage = q.get()
        inColor = inMessage["rgb"]
        inPointCloud = inMessage["pcl"]
        cvColorFrame = inColor.getCvFrame()
        # Convert the frame to RGB (Open3D expects RGB-ordered colors)
        cvRGBFrame = cv2.cvtColor(cvColorFrame, cv2.COLOR_BGR2RGB)
        fps = fpsCounter.tick()
        # Display the FPS on the frame
        cv2.putText(cvColorFrame, f"FPS: {fps:.2f}", (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
        cv2.imshow("color", cvColorFrame)
        key = cv2.waitKey(1)
        if key == ord('q'):
            break
        if inPointCloud:
            points = inPointCloud.getPoints().astype(np.float64)
            pcd.points = o3d.utility.Vector3dVector(points)
            # Color each point from the matching RGB pixel (depth is aligned to CAM_A)
            colors = (cvRGBFrame.reshape(-1, 3) / 255.0).astype(np.float64)
            pcd.colors = o3d.utility.Vector3dVector(colors)
            if first:
                vis.add_geometry(pcd)
                first = False
            else:
                vis.update_geometry(pcd)
        vis.poll_events()
        vis.update_renderer()
    vis.destroy_window()
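To try it out, save the listing to a file (the name pointcloud_visualization.py below is just a placeholder) and run it with an OAK device connected:
Command Line
python3 pointcloud_visualization.py
An OpenCV window shows the color stream with the FPS overlay, while the Open3D window shows the colored point cloud; press q in the OpenCV window or Q in the Open3D window to exit.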

Need assistance?

Head over to the Discussion Forum for technical support or any other questions you might have.