The driver can be configured by a large number of parameters. A description of all parameters for each node can be found below; most of them mirror the DepthAI C++/Python API. Parameters that begin with r_ can be freely modified during runtime, for example with rqt. Parameters that begin with i_ are set when the camera is initializing; to change them you have to call the stop and start services. This can be used to hot-swap NNs during runtime, change resolutions, etc.
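For orientation, here is a minimal sketch of how such parameters are typically supplied through a standard ROS 2 parameters file (for example via a params_file-style launch argument). The node name /oak is an assumption; adjust it and the parameter values to your setup.

```yaml
# params.yaml - hypothetical example; node name assumed to be /oak
/oak:
  ros__parameters:
    rgb:
      i_enable_feature_tracker: true   # i_* parameter: applied when the camera (re)initializes
      r_set_auto_exp_region: false     # r_* parameter: can also be changed at runtime (e.g. with rqt)
```

Runtime (r_) parameters can later be changed with rqt or ros2 param set, while changing i_ parameters requires calling the stop and start services mentioned above.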
Driver
driver.i_publish_tf_from_calibration: true <bool> - setting this to true launches TFPublisher
driver.i_tf_device_name: [NODE_NAME] <string> - if not set, defaults to the node name
driver.i_tf_device_model: '' <string> - if not set, it will be automatically detected. If the node is unable to detect an STL file for the device, it is set to OAK-D-S2. To explicitly set it in driver.launch.py, set override_cam_model:=true
driver.i_tf_base_frame: oak <string> - Frame that is used as the root for camera frames, usually in the same place as CAM_A (center sensor for the OAK-D lineup)
driver.i_tf_parent_frame: oak_parent_frame <string> - Set to the frame that you want to connect the camera to (for example a robot end effector)
driver.i_tf_cam_pos_x: '0.0' <string> - Position of base frame w.r.t parent frame
driver.i_tf_cam_pos_y: '0.0' <string> - Position of base frame w.r.t parent frame
driver.i_tf_cam_pos_z: '0.0' <string> - Position of base frame w.r.t parent frame
driver.i_tf_cam_roll: '0.0' <string> - Orientation of base frame w.r.t parent frame
driver.i_tf_cam_pitch: '0.0' <string> - Orientation of base frame w.r.t parent frame
driver.i_tf_cam_yaw: '0.0' <string> - Orientation of base frame w.r.t parent frame
Outside of the TF-publishing parameters, the driver has the following additional parameters (a combined example follows this list):
i_usb_speed: SUPER <string> - USB speed used for the current interface. It is deduced automatically but can be overridden, for example to use SUPER_PLUS explicitly. Available options:
LOW
FULL
HIGH (USB2)
SUPER (USB3)
SUPER_PLUS - for connecting with a high-speed interface (like Thunderbolt), can be unstable in certain scenarios
i_device_id: "" <string> - Used to connect to a device with a specific ID
i_ip "" <string> - Used to connect to a device with specific IP
i_usb_port_id: "" <string> - Used to connect to devices on specific USB port
i_pipeline_dump: false <bool> - Whether to dump created pipeline to a JSON file (used for debugging)
i_calibration_dump: false <bool> - Whether to dump calibration to a JSON file
i_external_calibration_path: "" <string> - Used to override device calibration
i_enable_ir: true <bool> - Whether to enable the IR projector
i_restart_on_diagnostics_error: false <bool> - If diagnostics publishing is enabled in the pipeline and it reports an error, the driver will be restarted
i_rs_compat: false <bool> - Turns on RealSense compatibility. Topic, frame, and parameter names will be changed to match RS cameras.
r_laser_dot_intensity: 0.6 <float> <0.0-1.0> - Sets the laser dot projector intensity. Additional power might be required via a Y-adapter for USB cameras
r_floodlight_intensity: 0.6 <float> <0.0-1.0> - Sets the floodlight projector intensity. Additional power might be required via a Y-adapter for USB cameras
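A combined sketch of the TF and device parameters above. The node name /oak is assumed, as is grouping the non-TF parameters under the same driver prefix; verify the exact prefixes against your driver version.

```yaml
/oak:
  ros__parameters:
    driver:
      i_publish_tf_from_calibration: true
      i_tf_parent_frame: base_link        # assumed parent frame in your TF tree
      i_tf_cam_pos_x: '0.1'               # note: pose values are passed as strings
      i_tf_cam_pos_z: '0.3'
      i_usb_speed: SUPER_PLUS             # override the auto-detected USB speed
      i_ip: '192.168.1.222'               # example IP of a PoE device
      r_laser_dot_intensity: 0.8          # runtime-adjustable dot projector intensity
```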
Pipeline generation
Parameters in this group start with the pipeline_gen prefix, for example pipeline_gen.i_pipeline_type (see the example at the end of this section).
i_pipeline_type can be one of:
RGB - Publishes only the RGB stream, NN available
RGBD - Publishes RGB + Depth streams (set i_publish_topic for left and right cameras to enable them), NN & Spatial NN available
Stereo - Publishes streams from left and right sensors, NN can be set for each sensor separately
RGBStereo - Publishes RGB + Left + Right streams, NN can be set for each sensor separately
Depth - Publishes only depth stream, NN can be set on stereo node via parameter
CamArray - Publishes streams for all detected sensors, NN can be set for each sensor separately
Tof - Publishes only TOF stream, no NN available
DepthTof - Publishes Depth + TOF streams, specific to OAK-D-SR-POE, NN can be set on stereo node via parameter
StereoToF - Publishes Stereo + TOF streams, specific to OAK-D-SR-POE, NN can be set on stereo node via parameter
RGBToF - Publishes RGB + TOF streams, specific to OAK-D-SR-POE, NN can be set on stereo node via parameter
i_nn_type: spatial <string> - Type of NN to be created with pipelines
i_enable_imu: true <bool> - Whether to enable IMU node in certain pipelines
i_enable_diagnostics: false <bool> - Whether to enable diagnostics node
i_enable_rgbd: false <bool> - Whether to enable RGBD node in certain pipelines
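A minimal sketch of a pipeline_gen configuration (node name /oak assumed):

```yaml
/oak:
  ros__parameters:
    pipeline_gen:
      i_pipeline_type: RGBD       # RGB + depth streams
      i_nn_type: spatial          # run a spatial NN on top of the pipeline
      i_enable_imu: true
      i_enable_diagnostics: false
```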
Image sensors
Sensor parameters begin with sensor_name.[PARAMETER_NAME], for example rgb.i_width
Initial parameters
i_enable_default_output: true <bool> - For now, the sensor node creates one default output that is used for image publishing and/or stereo alignment.
i_board_socket_id <int> - Socket ID of the sensor, used mostly for alignment, calibration and deducing frame names; automatically detected for each sensor but can be overridden
i_enable_feature_tracker: false <bool> - Enables FeatureTracker node for this particular sensor
i_enable_nn: false <bool> - Enables NeuralNetwork for this particular sensor
i_max_q_size: 8 <int> - Sets DAI queue size
i_low_bandwidth: false <bool> - When set to true, enables a VideoEncoder for this particular sensor to reduce bandwidth and improve latency between the camera and the host. By default, encoded frames are decoded back to ROS Image messages on the host (this applies only to the MJPEG encoder; for H26X, enable the i_publish_compressed parameter).
i_low_bandwidth_profile: 4 <int> - Type of encoder used. The following types are available:
H264_BASELINE,
H264_HIGH,
H264_MAIN,
H265_MAIN,
MJPEG (default)
i_low_bandwidth_frame_freq: 30 <int> - Frame frequency of the Encoder, used in H26X types
i_low_bandwidth_bitrate: 0 <int> - Bitrate of the Encoder, used in H26X types
i_low_bandwidth_quality: 50 <int> - Quality of the MJPEG Encoder
i_publish_compressed: false <bool> - Instead of converting DAI encoded frames to ROS Image messages, publish them in their respective compressed formats (CompressedImage for MJPEG and FFMPEGMsgs::FFMPEGPacket for H26X types)
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
i_calibration_file: "" <string> - Used to override DAI calibration with ROS calibration
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_enable_lazy_publisher: true <bool> - Whether to publish data only in the case where a subscription is detected on an image/camera_info topic
i_add_exposure_offset: false <bool> - Whether to add exposure offset to ROS timestamp
r_set_auto_exp_region: false <bool> - Whether to use only part of the image for controlling exposure
r_auto_exp_region_start_x: 0 <int> - X coordinate of the start point of the auto-exposure rectangle (in image coordinates, i.e. X right, Y down)
r_auto_exp_region_start_y: 0 <int> - Y coordinate of the start point of the auto-exposure rectangle (in image coordinates, i.e. X right, Y down)
r_auto_exp_region_width: 0 <int> - Width of the autoexposure region
r_auto_exp_region_height: 0 <int> - Height of the autoexposure region
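A sketch of a per-sensor configuration using the parameters above, here for the rgb sensor (node name /oak assumed):

```yaml
/oak:
  ros__parameters:
    rgb:
      i_low_bandwidth: true            # publish through a VideoEncoder
      i_low_bandwidth_profile: 4       # MJPEG (default)
      i_low_bandwidth_quality: 60
      r_set_auto_exp_region: true      # meter exposure only on a sub-rectangle
      r_auto_exp_region_start_x: 100
      r_auto_exp_region_start_y: 50
      r_auto_exp_region_width: 400
      r_auto_exp_region_height: 300
```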
Stereo parameters
Depth alignment
When setting stereo.i_aligned: true, the stereo output is aligned to the board socket specified by the stereo.i_board_socket_id parameter (0/CAM_A by default). You can enable rectified stereo streams by setting, for example in the case of the right stream, stereo.i_publish_right_rect: true.
Stereo socket order
By default, the right camera is treated as first when used for stereo computation, which is reflected in CameraInfo messages. If you want to reverse that logic, set stereo.i_reverse_stereo_socket_order: true (this can also be set for individual sensors).
Custom Sensor sockets
Which sensors are used for computing the stereo pair can be configured either programmatically, by specifying them in the constructor of the Stereo node (for example when building a custom pipeline), or via the stereo.i_left_socket_id/stereo.i_right_socket_id parameters (see the sketch below). Please note that currently, if you want to use the rgb/center socket instead of one of the default pairs, you will need to build a custom pipeline for that.
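A sketch combining the alignment, rectified-stream and socket-selection parameters described above (node name /oak and socket IDs are example values):

```yaml
/oak:
  ros__parameters:
    stereo:
      i_aligned: true                      # align depth to the socket below
      i_board_socket_id: 0                 # CAM_A
      i_publish_right_rect: true           # also publish the rectified right stream
      i_reverse_stereo_socket_order: false
      i_left_socket_id: 1                  # example: CAM_B as the left input
      i_right_socket_id: 2                 # example: CAM_C as the right input
```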
i_low_bandwidth: false <bool> - Refer to the parameter description in the Image sensors section. NOTE: This parameter is incompatible with the i_subpixel parameter.
i_low_bandwidth_profile: 4 <int> - Type of encoder used. The following types are available:
H264_BASELINE,
H264_HIGH,
H264_MAIN,
H265_MAIN,
MJPEG (default)
i_low_bandwidth_frame_freq: 30 <int> - Frame frequency of the Encoder, used in H26X types
i_low_bandwidth_bitrate: 0 <int> - Bitrate of the Encoder, used in H26X types
i_low_bandwidth_quality: 50 <int> - Quality of the MJPEG Encoder
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
i_calibration_file: "" <string> - Used to override DAI calibration with ROS calibration
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_enable_lazy_publisher: true <bool> - Whether to publish data only in the case where a subscription is detected on an image/camera_info topic
i_add_exposure_offset: false <bool> - Whether to add exposure offset to ROS timestamp
i_reverse_stereo_socket_order: false <bool> - See Stereo socket order section below for more information
i_synced: false <bool> - Whether to add the output to the synced outputs group; all synced publishers will publish images with the same timestamps
i_publish_topic: true/false <bool> - Whether to publish on a ROS topic; true by default for sensors on the CAM_A socket, false otherwise
i_output_disparity: false <bool> - Whether to output disparity instead of depth
i_enable_left_spatial_nn: false <bool> - Whether to create spatial NN with left sensor as RGB source
i_enable_right_spatial_nn: false <bool> - Whether to create spatial NN with right sensor as RGB source
i_enable_left_rgbd: false <bool> - Whether to create RGBD node with left sensor as RGB source
i_enable_right_rgbd: false <bool> - Whether to create RGBD node with right sensor as RGB source
i_aligned: true <bool> - Whether to align stereo node to i_board_socket_id
i_run_align_on_host: true <bool> - Whether to run the align node on the host (only applies to RVC4)
i_depth_preset: FAST_ACCURACY <string> - Which default depth preset to apply. WARNING: For presets other than FAST_ACCURACY, other stereo filtering parameters might be overridden. Additionally, in some cases a decimation filter is applied, resulting in a smaller final image size, and the increased processing might reduce resources that are shared with, for example, neural networks on RVC2 devices. The following presets are exposed:
FAST_ACCURACY
DEFAULT
FACE
HIGH_DETAIL
ROBOTICS
i_enable_distortion_correction: true <bool> - Toggles distortion correction on the output stream
i_set_disparity_to_depth_use_spec_translation: false <bool> - Whether to use the baseline from CAD data instead of the calibration data
i_bilateral_sigma: 0 <int> <0-65535> - Value for the bilateral sigma filter. Large values of this parameter mean that colors farther apart within the pixel neighbourhood will be mixed together, resulting in larger areas of semi-equal color.
i_stereo_conf_threshold: 15 <int> <0-255> - Confidence threshold for disparity calculation. High value means more depth values are discarded
i_subpixel: true <bool> - Compute disparity with sub-pixel interpolation (3 fractional bits by default). Suitable for long range. Currently incompatible with extended disparity and low_bandwidth mode. Setting subpixel results in more distinct depth values being output (fewer visible "steps" in pointclouds) at the cost of more bandwidth and resources being used. Automatically enabled by some depth presets.
i_subpixel_fractional_bits: 3 <int> <3-5> - Defines the number of fractional disparities: 2^x
i_rectify_edge_fill_color: 0 <int> <-1-255> - Fill color for missing data at the edges. 0-255 for grayscale or -1 to replicate pixels
i_alpha_scaling: 0.0 <float> - Alpha scaling parameter. It is a free scaling parameter between 0 (where all pixels in the undistorted image are valid) and 1 (where all the source image pixels are retained in the undistorted image). On some high-distortion lenses, and/or due to rectification (image rotated), invalid areas may appear even with alpha=0; in these cases alpha < 0.0 helps remove invalid areas.
i_extended_disp: false <bool> - Disparity range increased from 0-95 to 0-190, combined from full resolution and downscaled images. Suitable for short-range objects. Currently incompatible with sub-pixel disparity.
i_enable_companding: false <bool> - Enable disparity companding using sparse matching: matching pixel by pixel for N disparities, every 2nd pixel for M disparities, and every 4th pixel for T disparities. In the case of 96 disparities: N=48, M=32, T=16. This way the search range is extended to 176 disparities by sparse matching. Note: when enabling this flag, only the depth map will be affected; the disparity map will not be.
i_temporal_filter_alpha: 0.4 <float> <0.0-1.0> - The alpha factor in an exponential moving average with Alpha=1 - no filter. Alpha=0 - infinite filter. Determines the extent of the temporal history that should be averaged.
i_temporal_filter_delta: 20 <int> - Step-size boundary. Establishes the threshold used to preserve "edges". If the disparity value between neighboring pixels exceeds the threshold set by this delta parameter, filtering will be temporarily disabled. Default value 0 means auto: 3 disparity integer levels; in the case of subpixel mode it is 3 * number of subpixel levels.
i_temporal_filter_persistency: VALID_2_IN_LAST_4 <string> - Persistency mode. If the current disparity/depth value is invalid, it will be replaced by an older value, based on the persistency mode.
i_enable_disparity_shift: false <bool> - Shift input frame by a number of pixels to increase minimum depth. For example shifting by 48 will change effective disparity search range from (0,95] to [48,143]. An alternative approach to reducing the minZ. We normally only recommend doing this when it is known that there will be no objects farther away than MaxZ, such as having a depth camera mounted above a table pointing down at the table surface.
i_spatial_filter_hole_filling_radius: 2 <int> - An in-place heuristic symmetric hole-filling mode applied horizontally during the filter passes. Intended to rectify minor artefacts with minimal performance impact. Search radius for hole filling.
i_spatial_filter_alpha: 0.5 <float> - The Alpha factor in an exponential moving average with Alpha=1 - no filter. Alpha = 0 - infinite filter. Determines the amount of smoothing.
i_spatial_filter_delta: 20 <int> - Similar to delta in temporal filter.
i_spatial_filter_num_iterations: 1 <int> - Number of iterations over the image in both horizontal and vertical direction.
i_brightness_filter_min_brightness: 0 <int> - Minimum pixel brightness. If the input pixel is less than or equal to this value, the depth value is invalidated.
i_brightness_filter_max_brightness: 255 <int> - Maximum pixel brightness. If the input pixel is greater than or equal to this value, the depth value is invalidated.
i_decimation_filter_decimation_mode: PIXEL_SKIPPING <string> - Decimation algorithm type. Available values:
PIXEL_SKIPPING
NON_ZERO_MEDIAN
NON_ZERO_MEAN
i_decimation_filter_decimation_factor: 1 <int> <1-4> - Valid values are 1,2,3,4. Disparity/depth map x/y resolution will be decimated with this value.
The following parameters are the same for the left and right rectified streams; for the sake of brevity, only the left ones are listed below
i_publish_left_rect: false <bool> - Whether to publish rectified left stream
i_left_rect_low_bandwidth: false <bool> - Enables low-bandwidth (encoded) publishing for the rectified left stream
i_left_rect_low_bandwidth_profile: 4 <int> - Encoder profile for the rectified left low-bandwidth stream
i_left_rect_low_bandwidth_frame_freq: 30 <int> - Encoder frame frequency for the rectified left low-bandwidth stream
i_left_rect_low_bandwidth_bitrate: 0 <int> - Encoder bitrate for the rectified left low-bandwidth stream
i_left_rect_low_bandwidth_quality: 50 <int> - MJPEG encoder quality for the rectified left low-bandwidth stream
i_left_rect_publish_compressed: false <bool> - Whether to publish compressed images for left rect
i_left_rect_add_exposure_offset: false <bool> - Left Rect add exposure offset
i_left_rect_exposure_offset: 0 <int> - Left Rect exposure offset setting
i_left_rect_enable_feature_tracker: false <bool> - Whether to add feature tracker for rectified left
i_left_rect_synced: false <bool> - Whether to sync left rect with other streams
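A sketch of stereo post-processing and rectified-stream settings based on the parameters above (node name /oak assumed; remember that presets may override individual filter parameters):

```yaml
/oak:
  ros__parameters:
    stereo:
      i_depth_preset: ROBOTICS                          # may override some filter settings
      i_subpixel: true                                  # incompatible with i_low_bandwidth
      i_temporal_filter_persistency: VALID_2_IN_LAST_4
      i_decimation_filter_decimation_mode: NON_ZERO_MEDIAN
      i_decimation_filter_decimation_factor: 2
      i_publish_left_rect: true                         # also publish the rectified left stream
      i_left_rect_synced: true
```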
Feature Tracker
Each sensor node (and the rectified streams from the Stereo node) has the option to add a FeatureTracker node, which publishes depthai_ros_msgs/msg/TrackedFeatures messages. To enable features on, for example, the rgb node, set rgb.i_enable_feature_tracker: true. To enable publishing on rectified streams, set, for example, stereo.i_left_rect_enable_feature_tracker: true (see the example after the parameter list below).
Parameters
i_motion_estimation_type: LUKAS_KANADE_OPTICAL_FLOW <string> - Motion estimator type, can be:
LUKAS_KANADE_OPTICAL_FLOW
HW_MOTION_ESTIMATION
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
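A sketch enabling feature tracking on the rgb sensor and on the rectified left stream (node name /oak assumed):

```yaml
/oak:
  ros__parameters:
    rgb:
      i_enable_feature_tracker: true            # TrackedFeatures from the rgb sensor
    stereo:
      i_left_rect_enable_feature_tracker: true  # TrackedFeatures from the rectified left stream
```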
IMU
Parameters:
i_max_q_size: 8 <int> - Sets DAI queue size
i_acc_freq: 480 <int> - Accelerometer sensor frequency
i_enable_rotation: false <bool> - Whether to enable rotation vector & magnetometer data (available when using IMU_WITH_MAGN/ IMU_WITH_MAGN_SPLIT message type and BNO086 sensor)
i_gyro_cov: 0.0 <float> - Gyroscope covariance
i_gyro_freq: 400 <int> - Gyroscope frequency
i_mag_cov: 0.0 <float> - Magnetometer covariance
i_mag_freq: 100 <int> - Magnetometer frequency
i_max_batch_reports: 10 <int> - Max reports per batch
i_message_type: IMU <string> - ROS publisher type. Available options:
IMU - sensor_msgs/Imu
IMU_WITH_MAG - depthai_ros_msgs/ImuWithMagneticField
IMU_WITH_MAG_SPLIT - two publishers - sensor_msgs/Imu & sensor_msgs/MagneticField
i_rot_cov: -1.0 <float> - Rotation covariance
i_rot_freq: 400 <int> - Rotation frequency
i_sync_method: COPY <string> - Sync method. Available options (WARNING: RVC4 devices only support COPY for now):
COPY
LINEAR_INTERPOLATE_GYRO
LINEAR_INTERPOLATE_ACCEL
i_rotation_vector_type <string> - Type of rotation vector; for more information, refer to the IMU sensor documentation. Available options:
ROTATION_VECTOR
GAME_ROTATION_VECTOR
GEOMAGNETIC_ROTATION_VECTOR
ARVR_STABILIZED_ROTATION_VECTOR
ARVR_STABILIZED_GAME_ROTATION_VECTOR
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
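A sketch of an IMU configuration using the parameters above. The node name /oak and the imu parameter group name are assumptions:

```yaml
/oak:
  ros__parameters:
    imu:
      i_acc_freq: 400
      i_gyro_freq: 400
      i_sync_method: LINEAR_INTERPOLATE_ACCEL   # RVC4 devices currently support only COPY
      i_enable_rotation: true                   # requires a BNO086-based IMU
      i_rotation_vector_type: ROTATION_VECTOR
```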
Neural networks
In most of the pipeline types there is an option to easily enable neural networks.
camera.i_nn_type can be either none (no NN), rgb or spatial. This determines whether the loaded NN should also take depth information into account (and, for example, provide detections in 3D format). The default is spatial.
Parameters
i_nn_family: detection <string> - Sets what kind of NN we want to run. Currently available options are:
detection
segmentation (for now only on RVC2)
i_nn_model: yolov6-nano <string> - Which model to load; models are downloaded from the web automatically when the driver is started, similarly to the DepthAI C++/Python examples.
i_nn_confidence_threshold: 0.5 <float> - Confidence threshold for NN
i_nn_depth_lower_threshold: 100 <int> - (Spatial only) lower depth threshold for detections (in mm)
i_nn_depth_upper_threshold: 10000 <int> - (Spatial only) upper depth threshold for detections (in mm)
i_enable_passthrough: false <bool> - Whether to publish RGB passthrough image
i_enable_passthrough_depth: false <bool> - Whether to publish depth passthrough image
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
You can also choose to run an NN on a specific sensor by setting sensor_name.i_enable_nn to true. After that, you can similarly set sensor_name_nn.i_nn_model to a path to a config file. You can also set up a spatial NN for the Stereo node and use one of its left/right streams as the rgb input. This can be enabled on cameras with RGB side sensors such as the OAK-D-SR and OAK-D-LR. The setup is also similar:
stereo.i_enable_left_spatial_nn - set to true to enable spatial NN with left sensor as input
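A sketch tying these options together. The node name /oak and the nn group name are assumptions; the rgb_nn group follows the sensor_name_nn pattern described above, and the model path is hypothetical:

```yaml
/oak:
  ros__parameters:
    nn:                                  # assumed group name for the NN node parameters
      i_nn_family: detection
      i_nn_model: yolov6-nano
      i_nn_confidence_threshold: 0.6
    rgb:
      i_enable_nn: true                  # run an additional NN on the rgb sensor
    rgb_nn:
      i_nn_model: /path/to/config.json   # hypothetical path to a model config file
    stereo:
      i_enable_left_spatial_nn: true     # spatial NN fed by the left sensor
```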
Sync
Currently you can sync any number of image topics with a sync node wrapper. To create a sync node, set pipeline_gen.i_enable_sync: true and, for example, rgb.i_synced: true and stereo.i_synced: true. This example configuration is applied when running driver.launch.py with the pointcloud.enable:=true argument.
Parameters
i_sync_threshold: 10 <int> - Set maximal interval between messages in a group in milliseconds.
i_sync_attempts: 10 <int> - Number of attempts for syncing
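A sketch of a sync configuration for the rgb and stereo outputs. The node name /oak and the sync parameter group name are assumptions:

```yaml
/oak:
  ros__parameters:
    pipeline_gen:
      i_enable_sync: true
    rgb:
      i_synced: true
    stereo:
      i_synced: true
    sync:                       # assumed group name for the sync node parameters
      i_sync_threshold: 10      # maximum spread within a synced group, in milliseconds
```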
ToF
The ToF node is available in the Tof, RGBToF, StereoToF and DepthTof pipelines. Its parameters are similar to those of a regular camera node
Parameters
i_publish_topic: true <bool> - Whether to publish topic
i_max_q_size: 8 <int> - Sets DAI queue size
i_low_bandwidth: false <bool> - Refer to the parameter description in the Image sensors section.
i_low_bandwidth_profile: 4 <int> - Type of encoder used. The following types are available:
H264_BASELINE,
H264_HIGH,
H264_MAIN,
H265_MAIN,
MJPEG (default)
i_low_bandwidth_frame_freq: 30 <int> - Frame frequency of the Encoder, used in H26X types
i_low_bandwidth_bitrate: 0 <int> - Bitrate of the Encoder, used in H26X types
i_low_bandwidth_quality: 50 <int> - Quality of the MJPEG Encoder
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
i_calibration_file: "" <string> - Used to override DAI calibration with ROS calibration
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_enable_lazy_publisher: true <bool> - Whether to publish data only in the case where a subscription is detected on an image/camera_info topic
i_publish_compressed: false <bool> - Whether to publish compressed images
i_add_exposure_offset: false <bool> - Whether to add exposure offset to ROS timestamp
i_aligned: false <bool> - Whether to align the image to the other sensor
i_aligned_socket_id: "" <string> - Socket to align the image to
i_run_align_on_host: true <bool> - Whether to run the align node on the host (only applies to RVC4)
i_synced: false <bool> - Whether to add the output to the synced outputs group; all synced publishers will publish images with the same timestamps
i_preset_mode: TOF_MID_RANGE <string> - Preset mode for image filtering
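A sketch of a ToF configuration. The node name /oak and the tof parameter group name are assumptions:

```yaml
/oak:
  ros__parameters:
    tof:
      i_publish_topic: true
      i_aligned: true               # align the ToF image to another sensor
      i_preset_mode: TOF_MID_RANGE
      i_synced: true
```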
Thermal
The thermal camera node operates similarly to a camera node, with the option to set some additional runtime parameters
Parameters
i_publish_topic: true <bool> - Whether to publish topic
i_max_q_size: 8 <int> - Sets DAI queue size
i_low_bandwidth: false <bool> - Refer to the parameter description in the Image sensors section.
i_low_bandwidth_profile: 4 <int> - Type of encoder used. The following types are available:
H264_BASELINE,
H264_HIGH,
H264_MAIN,
H265_MAIN,
MJPEG (default)
i_low_bandwidth_frame_freq: 30 <int> - Frame frequency of the Encoder, used in H26X types
i_low_bandwidth_bitrate: 0 <int> - Bitrate of the Encoder, used in H26X types
i_low_bandwidth_quality: 50 <int> - Quality of the MJPEG Encoder
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
i_calibration_file: "" <string> - Used to override DAI calibration with ROS calibration
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_enable_lazy_publisher: true <bool> - Whether to publish data only in the case where a subscription is detected on an image/camera_info topic
i_publish_compressed: false <bool> - Whether to publish compressed images
i_add_exposure_offset: false <bool> - Whether to add exposure offset to ROS timestamp
r_time_noise_filter_level: 0 <int> <0-3> - Time noise filter level. Filters out the noise that appears over time.
RGBD
The RGBD node can be enabled in some pipelines by turning on the pipeline_gen.i_enable_rgbd parameter or by setting i_enable_left_rgbd/i_enable_right_rgbd in the Stereo node.
Parameters
i_publish_topic: true <bool> - Whether to publish topic
i_max_q_size: 2 <int> - Sets DAI queue size
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
i_num_threads: 1 <int> - Number of threads used for RGBD node
i_run_sync_on_host: true <bool> - Whether to sync messages on host
i_run_align_on_host: true <bool> - Whether to run the align node on the host (only applies to RVC4)
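A sketch enabling and tuning the RGBD node. The node name /oak and the rgbd parameter group name are assumptions:

```yaml
/oak:
  ros__parameters:
    pipeline_gen:
      i_enable_rgbd: true           # add an RGBD node to the generated pipeline
    rgbd:
      i_num_threads: 2
      i_run_sync_on_host: true
      i_run_align_on_host: true     # only applies to RVC4
```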
VIO
The VIO node can be enabled in some pipelines by turning on the pipeline_gen.i_enable_vio parameter.
Parameters
i_publish_topic: true <bool> - Whether to publish topic
i_max_q_size: 2 <int> - Sets DAI queue size
i_update_ros_base_time_on_ros_msg: false <bool> - Enable if you experience timeshifts in your system (for example in the case where the system clock is externally controlled)
i_get_base_device_timestamp: false <bool> - Controls whether DAI timestamps obtained for data are synchronized to host (Timestamps themselves are still aligned to ROS time)
i_run_sync_on_host: true <bool> - Whether to sync messages on host
i_imu_update_rate: 400 <int> - Rate of updating the imu messages
i_width: 640 <int> - Width of the images
i_height: 400 <int> - Height of the images
i_fps: 60.0 <float> - FPS of the images
i_config_path: 'depthai_ros_driver_default.json' <string> - Full path to the config.json used; when the default value is used, the config from depthai_ros_driver/config/custom is used
i_override_local_transform: false <bool> - Whether to override the local transform; when enabled, the following parameters can be set:
i_local_transform_x: 0.0 <float> - X position
i_local_transform_y: 0.0 <float> - Y position
i_local_transform_z: 0.0 <float> - Z position
i_local_transform_qx: 0.0 <float> - X component of quaternion
i_local_transform_qy: 0.0 <float> - Y component of quaternion
i_local_transform_qz: 0.0 <float> - Z component of quaternion
i_local_transform_qw: 0.0 <float> - W component of quaternion
i_override_imu_extrinsics: false <bool> - Whether to override IMU extrinsics with other values
i_imu_extr_x: 0.0 <float> - X position
i_imu_extr_y: 0.0 <float> - Y position
i_imu_extr_z: 0.0 <float> - Z position
i_imu_extr_qx: 0.0 <float> - X component of quaternion
i_imu_extr_qy: 0.0 <float> - Y component of quaternion
i_imu_extr_qz: 0.0 <float> - Z component of quaternion
i_imu_extr_qw: 0.0 <float> - W component of quaternion
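A sketch of a VIO configuration with an overridden local transform. The node name /oak and the vio parameter group name are assumptions, and the transform values are placeholders:

```yaml
/oak:
  ros__parameters:
    pipeline_gen:
      i_enable_vio: true
    vio:
      i_imu_update_rate: 400
      i_width: 640
      i_height: 400
      i_fps: 60.0
      i_override_local_transform: true
      i_local_transform_z: 0.3        # example camera height above the parent frame, in meters
      i_local_transform_qw: 1.0       # identity orientation
```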