DepthAI ROS Driver
Version Warning
This documentation targets ROS2 Kilted and newer. For ROS2 Jazzy/Humble/Noetic, please visit the V2 Docs.
Quickstart
Classic installation
Install the driver:
Command Line
sudo apt install ros-$ROS_DISTRO-depthai-ros
Run the driver:
Command Line
ros2 launch depthai_ros_driver driver.launch.py
To also start RViz:
Command Line
ros2 launch depthai_ros_driver driver.launch.py use_rviz:=true
use_rviz:=true works for all launch files that utilize driver.launch.py. Note that this launch file targets the OAK-D lineup; for devices with different configurations, such as OAK-D-PoE-SR or OAK-FFC, you might need to use a different launch file or a different parameter combination. Each launch file runs the ROS driver as a composable node inside its own composable node container; to run the driver as a separate node instead, you can run:
Command Line
ros2 run depthai_ros_driver driver
Prebuilt Docker images
Allow the Docker container to access the X server (needed for GUI tools such as RViz):
Command Line
xhost +local:docker
Run the latest image:
Command Line
docker run -it -v /dev/:/dev/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix luxonis/depthai-ros:kilted-latest
To run a specific command inside the container:
Command Line
docker run -it -v /dev/:/dev/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix luxonis/depthai-ros:kilted-latest [CMD]
To pin a specific release:
Command Line
docker run -it -v /dev/:/dev/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix luxonis/depthai-ros:v3.0.9-kilted
For arm64 hosts, use the arm64 images:
Command Line
docker run -it -v /dev/:/dev/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix luxonis/depthai-ros:kilted-arm64-latest
Command Line
docker run -it -v /dev/:/dev/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix luxonis/depthai-ros:v3.0.9-kilted-arm64
To use a custom configuration, mount a directory containing your parameter file (e.g. /home/YOUR_USERNAME_HERE/params/example_config.yaml) and pass it as an argument to the executable, as follows:
Command Line
docker run -it -v /dev/:/dev/ -v /home/YOUR_USERNAME_HERE/params:/params --network host --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix depthai-ros ros2 launch depthai_ros_driver driver.launch.py params_file:=/params/example_config.yaml
Parameters
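The driver is configured through ROS 2 parameters; the params_file argument above points to such a file. Below is a minimal sketch of what a parameter file can look like, assuming the default node name /oak and using parameters that are described later on this page; consult the parameter reference for your driver version for the full list.
Yaml
# Hypothetical example_config.yaml; node name and values are illustrative.
/oak:
  ros__parameters:
    driver:
      i_tf_imu_from_descr: true
    pipeline_gen:
      i_enable_diagnostics: true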
Launch files
- driver.launch.py - launches the camera in RGBD and the NN in spatial mode (using Yolo V6).
- rgbd_pcl.launch.py - launches the camera in a basic RGBD configuration without loading any NNs. Also loads ROS depth processing nodes for an RGBD pointcloud.
- example_multicam.launch.py - launches several cameras at once, each one in a different container. Edit the multicam_example.yaml config file in the config directory to change parameters (currently only for RVC2).
- example_segmentation.launch.py - launches the camera in RGBD + semantic segmentation (pipeline type=RGBD, nn_type=rgb).
- pointcloud.launch.py - similar to rgbd_pcl.launch.py, but doesn't use the RGB component for the pointcloud. Note that the pointcloud here is published on a different topic.
- rtabmap.launch.py - launches the camera and RTAB-Map RGBD SLAM (you need to install it first - sudo apt install ros-$ROS_DISTRO-rtabmap-ros); see the example command after this list. You might need to set manual focus via parameters here.
- sr_rgbd_pcl.launch.py - should be used as a baseline for SR cameras, since they only have two sensors.
- sr_poe_rgbd_pcl.launch.py - should be used as a baseline for SR PoE cameras.
- oak_t.launch.py - a launch file specific to the OAK-T lineup.
- calibration.launch.py - used to launch the ROS calibration tool; this is mostly for testing. If you find that your camera has invalid calibration, either use our internal calibration tool or contact us directly.
- vio.launch.py - launches the camera with the VIO pipeline enabled by default.
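For example, to run the RTAB-Map example from the list above:
Command Line
sudo apt install ros-$ROS_DISTRO-rtabmap-ros
ros2 launch depthai_ros_driver rtabmap.launch.py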

Publishing TFs from extrinsics
The following frames are used by default:
- parent_frame: "oak_parent_frame"
- base_frame: "oak" (or node name)
- imu_frame: "oak_imu_frame"
By default, camera transforms are published based on the device's camera calibration data. When using driver.launch.py, you can set pass_tf_args_as_params:=true so that TF arguments are used to fill those parameters, for example: ros2 launch depthai_ros_driver driver.launch.py pass_tf_args_as_params:=true parent_frame:=map cam_pos_x:=1.0 imu_from_descr:=true
It is also possible to set a custom URDF path (for now only an absolute path works) and custom xacro arguments using driver.i_tf_custom_urdf_path and driver.i_tf_custom_xacro_args. Please note that robot_state_publisher must be running.
Note on Extrinsics
If your camera has an uncalibrated IMU, a warning will be shown and the IMU transform will be published with zero rotation and translation. You can override this behavior by setting driver.i_tf_imu_from_descr: true. This will publish default IMU extrinsics from the URDF based on the camera model.
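If you prefer to set the TF-related parameters from a parameter file, a minimal sketch is shown below; the node name /oak and the URDF path/xacro arguments are placeholders.
Yaml
/oak:
  ros__parameters:
    driver:
      i_tf_imu_from_descr: true
      # Placeholder values - point these at your own URDF/xacro setup.
      i_tf_custom_urdf_path: "/absolute/path/to/custom_camera.urdf.xacro"
      i_tf_custom_xacro_args: "example_arg:=example_value"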
Custom pipelines
To create a custom pipeline plugin, create a separate package (for example test_plugins) and an executable in that package (test_plugins.cpp). Inside that file, define a custom plugin that inherits from depthai_ros_driver::pipeline_gen::BasePipeline and overrides the createPipeline method. For example:
C++
std::vector<std::unique_ptr<dai_nodes::BaseNode>> StereoToF::createPipeline(std::shared_ptr<rclcpp::Node> node,
                                                                            std::shared_ptr<dai::Device> device,
                                                                            std::shared_ptr<dai::Pipeline> pipeline,
                                                                            std::shared_ptr<param_handlers::PipelineGenParamHandler> ph,
                                                                            const std::string& deviceName,
                                                                            bool rsCompat,
                                                                            const std::string& /*nnType*/) {
    std::vector<std::unique_ptr<dai_nodes::BaseNode>> daiNodes;
    // Create a ToF node and sensor wrappers for the left/right sockets.
    auto tof = std::make_unique<dai_nodes::ToF>("tof", node, pipeline, deviceName, rsCompat);
    auto left = std::make_unique<dai_nodes::SensorWrapper>("left", node, pipeline, deviceName, rsCompat, dai::CameraBoardSocket::CAM_B);
    auto right = std::make_unique<dai_nodes::SensorWrapper>("right", node, pipeline, deviceName, rsCompat, dai::CameraBoardSocket::CAM_C);
    // If ToF alignment is enabled, link the sensor on the matching socket to the ToF input.
    if(tof->isAligned() && tof->getSocketID() == right->getSocketID()) {
        auto in = tof->getInput();
        right->getDefaultOut()->link(in);
    } else if(tof->isAligned() && tof->getSocketID() == left->getSocketID()) {
        auto in = tof->getInput();
        left->getDefaultOut()->link(in);
    }
    daiNodes.push_back(std::move(left));
    daiNodes.push_back(std::move(right));
    daiNodes.push_back(std::move(tof));
    return daiNodes;
}
Then export the plugin class with pluginlib:
C++
#include <pluginlib/class_list_macros.hpp>
PLUGINLIB_EXPORT_CLASS(test_plugins::Test, depthai_ros_driver::pipeline_gen::BasePipeline)
Add a plugin description file declaring the new class:
Xml
<library path="test_plugins">
  <class type="test_plugins::Test" base_class_type="depthai_ros_driver::pipeline_gen::BasePipeline">
    <description>Test Pipeline.</description>
  </class>
</library>
To use the new pipeline, set pipeline_gen.i_pipeline_type to test_plugins::Test.
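For example, selecting the new pipeline from a parameter file (the default node name /oak is assumed here):
Yaml
/oak:
  ros__parameters:
    pipeline_gen:
      # Name matches the class exported and declared above.
      i_pipeline_type: test_plugins::Test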
Specific camera configurations
PoE Cameras
For PoE cameras, you can enable low-bandwidth streams to reduce network load; note that subpixel depth filtering is disabled in this mode. To enable low_bandwidth, for example for the rgb camera, change the following parameters:
- rgb.i_low_bandwidth - true to enable
- rgb.i_low_bandwidth_quality - desired quality % (default 50)
See the low_bandwidth.yaml file for example parameters for all streams.
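The same parameters expressed as a parameter file sketch (default node name /oak assumed):
Yaml
/oak:
  ros__parameters:
    rgb:
      i_low_bandwidth: true
      i_low_bandwidth_quality: 50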
ToF Cameras
For cameras with ToF sensors, a good starting point is sr_poe_rgbd_pcl.launch.py. This also utilizes a pipeline configuration specific to OAK-D-SR; you can find more information about the pipeline types above. Please note that currently ToF cameras publish images in the rgb frame due to the sensor socket naming convention.
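For example (launch file name taken from the list of launch files above):
Command Line
ros2 launch depthai_ros_driver sr_poe_rgbd_pcl.launch.py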
Stopping/starting camera for power saving/reconfiguration
To automatically restart the camera when diagnostics report an error, set:
- pipeline_gen.i_enable_diagnostics: true
- driver.i_restart_on_diagnostics_error: true
RealSense compatibility
You can run driver.launch.py with the rs_compat:=true argument to mimic RealSense camera behavior. This sets up the camera with the RGBD pipeline and publishes topics with names compatible with a RealSense camera. TF frames will also be available in RealSense format. Similarly to how the RS node works, you can enable pointcloud publishing with the pointcloud.enable:=true argument. Please note that the driver will spawn with a different name and namespace, as well as different DAI node names, so you might need to adjust your config files accordingly. The naming convention is as follows:
- Namespace: '' -> camera
- Node name: oak -> camera
- DAI node names: rgb -> color, stereo -> depth, left -> infra_2, right -> infra_1
- Frames: oak -> camera_link
Example
Python
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import PathJoinSubstitution
from launch_ros.substitutions import FindPackageShare

# Example inclusion of RealSense driver
rs_camera_launch_include = IncludeLaunchDescription(
    PythonLaunchDescriptionSource([
        PathJoinSubstitution([
            FindPackageShare('realsense2_camera'),
            'launch',
            'rs_launch.py',
        ])
    ]),
    launch_arguments={
        # Launch argument values must be strings
        'pointcloud.enable': 'true',
    }.items()
)
Python
# Example inclusion of OAK driver (same imports as above)
oak_driver_launch_include = IncludeLaunchDescription(
    PythonLaunchDescriptionSource([
        PathJoinSubstitution([
            FindPackageShare('depthai_ros_driver'),
            'launch',
            'driver.launch.py',
        ])
    ]),
    launch_arguments={
        # Launch argument values must be strings
        'rs_compat': 'true',
        'pointcloud.enable': 'true',
    }.items()
)
RTABMap example with RealSense compatibility
This example uses a launch file from the rtabmap_ros package, realsense_d435i_color.launch.py.
Note: as of this moment (2024-08-27), the example needs to be adjusted for RS topic names as follows:
Before
Python
remappings=[
    ('imu', '/imu/data'),
    ('rgb/image', '/camera/color/image_raw'),
    ('rgb/camera_info', '/camera/color/camera_info'),
    ('depth/image', '/camera/realigned_depth_to_color/image_raw')]
After
Python
remappings=[
    ('imu', '/imu/data'),
    ('rgb/image', '/camera/camera/color/image_raw'),
    ('rgb/camera_info', '/camera/camera/color/camera_info'),
    ('depth/image', '/camera/camera/depth/image_rect_raw')]
Launch the camera driver, either the RealSense one:
Command Line
ros2 launch realsense2_camera rs_launch.py enable_gyro:=true enable_accel:=true unite_imu_method:=1 enable_sync:=true
or the OAK driver in RealSense compatibility mode:
Command Line
ros2 launch depthai_ros_driver driver.launch.py rs_compat:=true
You can also check the rtabmap.launch.py example in the depthai_ros_driver package for running RTABMap with an OAK camera. With slight modifications you can also enable RS mode for this example.
Recalibration
To recalibrate the camera, you can either:
- Use the set_camera_info services available for each of the image streams
- Use the i_calibration_file parameter to point to a calibration file. Note that the camera name must start with /, so for example /rgb. See depthai_ros_driver/config/calibration for example calibration files.
A calibration.launch file is provided to start up a ROS camera calibrator node in both monocular and stereo configurations. Calibration file syntax (from camera_info_manager):
Yaml
- file:///full/path/to/local/file.yaml
- file:///full/path/to/videre/file.ini
- package://camera_info_manager/tests/test_calibration.yaml
- package://ros_package_name/calibrations/camera3.yaml
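For reference, the contents of a camera_info_manager-style calibration YAML file look roughly like the sketch below; all numeric values are illustrative placeholders, so use the output produced by the calibration tool for your actual camera.
Yaml
image_width: 1280
image_height: 720
camera_name: /rgb
camera_matrix:
  rows: 3
  cols: 3
  data: [1000.0, 0.0, 640.0, 0.0, 1000.0, 360.0, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.0, 0.0, 0.0, 0.0, 0.0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
projection_matrix:
  rows: 3
  cols: 4
  data: [1000.0, 0.0, 640.0, 0.0, 0.0, 1000.0, 360.0, 0.0, 0.0, 0.0, 1.0, 0.0]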
Developers guide
- Create a separate workspace
- Clone the repository into src
- Copy the .devcontainer directory into the main workspace directory
- Open the workspace directory in VSCode
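A rough command-line sketch of the first two steps; the repository URL is the main depthai-ros repository, and you may need to check out the branch matching your ROS distribution:
Command Line
mkdir -p dai_ws/src
cd dai_ws/src
git clone https://github.com/luxonis/depthai-ros.git
cd ..
rosdep install --from-paths src --ignore-src -r -y
colcon build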