RAE ROS
Setup procedure
SSH
- Connect via USB cable or WiFi: network rae-<ID>, password wifiwifi@ (see the RAE getting started documentation).
- To use SSH without typing the password each time - ssh-copy-id root@192.168.11.1. An even easier solution for USB-connected devices is to append the following to ~/.ssh/config, so you can connect with just ssh ku:
Plain Text
Host ku 192.168.197.55
    HostName 192.168.197.55
    User root
    StrictHostKeyChecking no
    UserKnownHostsFile /dev/null
- The date currently resets after each startup. To set the current date: ssh root@192.168.11.1 sudo date -s @$( date -u +"%s" )
- If you want to run ROS packages while bypassing Luxonis Hub, it is advised to stop the RobotHub (RH) agent before starting docker containers - robothub-ctl stop. Otherwise you can easily run into conflicts, as both would be competing for the same hardware resources. Keep in mind that since wpa_supplicant is a subprocess of the RH agent, the WiFi connection will be killed along with the agent. To resolve this, we recommend manually setting up the WiFi connection as done in this guide.
Generating docker image
Downloading a prebuilt image is recommended if you are not planning to considerably change the source code - docker pull luxonis/rae-ros-robot:humble
- Clone the repository - git clone git@github.com:luxonis/rae-ros.git
- Build the docker image - cd rae && docker buildx build --platform arm64 --build-arg USE_RVIZ=0 --build-arg SIM=0 --build-arg ROS_DISTRO=humble --build-arg CORE_NUM=10 -f Dockerfile --squash -t <docker-image-name>:<tag> --load . Along with buildx, you need to install the necessary emulation tools - sudo docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
- Upload the docker image to the robot. Connect the robot to your PC via USB so the image transfers quicker. Note that space on the robot is currently limited, so you need to have 7-8 GB of free space in the /data directory - docker save <docker-image-name>:<tag> | ssh -C root@192.168.197.55 docker load
- SSH into the robot and run the docker image (or skip the first 3 steps by using the prebuilt image and run the second command):
docker run -it --restart=unless-stopped -v /dev/:/dev/ -v /sys/:/sys/ --privileged --net=host <docker-image-name>:<tag>
docker run -it -v /dev/:/dev/ -v /sys/:/sys/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix -v /dev/bus/usb:/dev/bus/usb --device-cgroup-rule='c 189:* rmw' --network host <docker-image-name>:<tag>
- Search for the docker container name with docker ps
- Attach to the shell - docker attach <container_name>, or if you want to create a separate session - docker exec -it <container_name> zsh
- To launch the robot hardware - ros2 launch rae_bringup robot.launch.py. This launches:
  - Motor drivers and differential controller
  - Camera driver, currently set up to provide depth and streams from the left & right cameras. Note that you have to calibrate the cameras (see steps below). Currently a default calibration file is loaded, located in rae_camera/config/cal.json. To use one stored on the device or from another path, change the i_external_calibration_path parameter in rae_camera/config/camera.yaml
  - Depth image -> LaserScan conversion node used for SLAM
- Launching the whole stack - ros2 launch rae_bringup bringup.launch.py. It has the following arguments for enabling parts of the stack: enable_slam_toolbox (true), enable_rosbridge (false), enable_rtabmap (false), enable_nav (false). Example launch with an argument - ros2 launch rae_bringup bringup.launch.py enable_nav:=false. See the sketch below for including this launch file from your own launch file.
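If you want to start the stack from your own launch file instead of the command line, the bringup launch can be included with these arguments. This is only a minimal sketch: it assumes rae_bringup installs its launch files under a launch/ directory; everything else is standard ROS 2 launch API.
Python
# Minimal sketch: include bringup.launch.py from another ROS 2 launch file
# and override one of its arguments.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource


def generate_launch_description():
    # Path assumes launch files are installed under share/rae_bringup/launch
    bringup_launch = os.path.join(
        get_package_share_directory('rae_bringup'), 'launch', 'bringup.launch.py')
    return LaunchDescription([
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(bringup_launch),
            # Same effect as passing enable_nav:=false on the command line
            launch_arguments={'enable_nav': 'false'}.items(),
        )
    ])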
Developing on docker
MAKEFLAGS="-j1 -l1" colcon build --symlink-install --packages-select rae_hw). For related camera stuff you can build locally on PC due to hardware limitation of RAE. For quickly rebuild broken images is recommended to use docker.You can also test/develop/build on PC with connected RAE for rapid prototyping. This is recommended for rebuilding camera related stuff and testing, also need to comment from rae_camera.launch.py the following:Python
RegisterEventHandler(
    OnProcessStart(
        target_action=perception,
        on_start=[
            TimerAction(
                period=15.0,
                actions=[reset_pwm, LogInfo(msg='Resetting PWM.'),],
            )
        ]
    )
),
- Open the rae-ros workspace in VSCode
- Create a .devcontainer directory in the workspace
- In it, create devcontainer.json
- You'll need to add the Remote Containers extension to VSCode if you haven't already
- After that, a window should pop up asking if you want to reopen the directory in a container - select yes. If nothing pops up, press CTRL+SHIFT+P and select the "Rebuild and reopen" option
devcontainer.json
// See https://aka.ms/vscode-remote/devcontainer.json for format details.
{
    "dockerFile": "../src/rae-ros/Dockerfile",
    "build": {
        "args": {
            "USE_RVIZ": "1",
            "SIM": "1",
            "CORE_NUM": "10",
            "--ssh": "default=$HOME/.ssh/id_rsa.pub ."
        }
    },
    "remoteUser": "root",
    "runArgs": [
        "--device=/dev/ttyUSB0",
        "--privileged",
        "--network=host",
        "--cap-add=SYS_PTRACE",
        "--security-opt=seccomp:unconfined",
        "--security-opt=apparmor:unconfined",
        "--volume=/dev:/dev",
        "--volume=/tmp/.X11-unix:/tmp/.X11-unix",
        "--volume=${env:HOME}/.ssh:/${HOME}/.ssh",
        // "--gpus=all"
    ],
    "containerEnv": {
        "DISPLAY": "${localEnv:DISPLAY}",
        "QT_X11_NO_MITSHM": "1",
        // "LIBGL_ALWAYS_SOFTWARE": "1" // Needed for software rendering of opengl
    },
    // Set *default* container specific settings.json values on container create.
    "settings": {
        "terminal.integrated.profiles.linux": {
            "zsh": {
                "path": "zsh"
            },
            "bash": {
                "path": "bash"
            }
        },
        "terminal.integrated.defaultProfile.linux": "zsh"
    },
    "extensions": [
        "dotjoshjohnson.xml",
        "ms-azuretools.vscode-docker",
        "ms-iot.vscode-ros",
        "ms-python.python",
        "ms-vscode.cpptools",
        "redhat.vscode-yaml",
        "smilerobotics.urdf",
        "streetsidesoftware.code-spell-checker",
        "twxs.cmake",
        "yzhang.markdown-all-in-one",
        "augustocdias.tasks-shell-input",
        "eamodio.gitlens"
    ]
}
To allow GUI applications in the container to access your X server, run xhost +local:docker on the host machine.
Calibration
Before calibrating, stop the RobotHub agent:
Command Line
robothub-ctl stop
Then clone and set up the calibration tool:
Command Line
git clone --branch rvc3_calibration https://github.com/luxonis/depthai.git
cd depthai/
python3 install_requirements.py
# To calibrate RAE's front cameras - for the back cameras we would change the board name to "RAE-D-E"
python3 calibrate.py -s <size> -brd RAE-A-B-C -cd 1 -c 3
- Try to fill the stereo pair matrices (the color camera preview can be out of FOV, but not the stereo pairs)
- Put the charuco board on a flat surface (the bigger the board, the better)
- When you capture a frame, it is better practice to keep the board still (motion is not OK)
Configuration:
Feature Tracker
The feature tracker publishes depthai_ros_msgs/msg/TrackedFeatures messages. To enable features on, for example, the rgb node, set rgb: i_enable_feature_tracker: true. To enable publishing on rectified streams, set, for example, stereo: i_left_rect_enable_feature_tracker.
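As a quick sanity check that features are actually being published, a small subscriber can print how many features arrive per message. This is only a sketch: the topic name used here (/rae/right/features) is an assumption and depends on your camera configuration, and it assumes the TrackedFeatures message exposes its feature list in a features field.
Python
# Minimal sketch: count incoming TrackedFeatures messages.
# Topic name is an assumption - adjust it to match your camera configuration.
import rclpy
from rclpy.node import Node
from depthai_ros_msgs.msg import TrackedFeatures


class FeatureListener(Node):
    def __init__(self):
        super().__init__('feature_listener')
        self.sub = self.create_subscription(
            TrackedFeatures, '/rae/right/features', self.callback, 10)

    def callback(self, msg):
        # 'features' is assumed to be the array field of the message
        self.get_logger().info(f'Received {len(msg.features)} tracked features')


def main():
    rclpy.init()
    rclpy.spin(FeatureListener())
    rclpy.shutdown()


if __name__ == '__main__':
    main()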
Setting Camera and IMU parameters
Camera and IMU parameters are set in rae_camera/config/camera.yaml. For all available settings, refer to depthai-ros.
Robot Localization
Robot localization (EKF) parameters are set in rae_hw/config/ekf.yaml. For more information refer to this link.
Sensors and sockets
Peripherals:
- LCD node - accepts a BGR8 image (best if already resized to 160x80px) on the /lcd Image topic
- LED node - Subscribes to /led topic, message type is LEDControl (refer to rae_msgs/msg/LEDControl)
- Mic node - Publishes audio_msgs/msg/Audio (from gst_bridge package) on /audio_in, configuration is S32_LE, 48kHz, 2 channel interleaved
- Speakers node - Subscribes to /audio_out with the same message type as the Mic node, configuration is S16_LE, 41kHz, 2 channel interleaved
Testing motors
In rae_hw/test you can find three scripts that will help you verify that the motors are running correctly. If you want to change arguments, you need to provide all of them.
- To find out if an encoder is working accurately, execute ros2 run rae_hw test_encoders and rotate the wheel by 360 degrees. After the rotation, the encoder readout should be ~2*PI. If not, adjust the encoder ticks per revolution parameter. Script arguments - [encRatioL encRatioR]. Full-argument version - ros2 run rae_hw test_encoders 756 756
- Finding out the max speed - ros2 run rae_hw test_max_speed. Script arguments - [duration encRatioL encRatioR]. Full-argument version - ros2 run rae_hw test_max_speed 1.0 756 756
- Motor verification - ros2 run rae_hw test_motors. Script arguments - [duration speedL speedR encRatioL encRatioR maxVelL maxVelR]. Full-argument version - ros2 run rae_hw test_motors 5.0 16.0 16.0 756 756 32 32
Motor parameters are set in the rae_description/urdf/rae_ros2_control.urdf.xacro file. Pin numbers shouldn't change between devices, but if they do, you can edit that file to set new ones.
- PWM pins (speed control):
Xml
1<param name="pwmL">2</param>
2<param name="pwmR">1</param>- Phase pins (direction control)
Xml
1<param name="phL">41</param>
2<param name="phR">45</param>- Encoder pins - each motor has A and B pins for encoders.
Xml
1<param name="enLA">42</param>
2<param name="enLB">43</param>
3<param name="enRA">46</param>
4<param name="enRB">47</param>- How many encoder tics are there per revolution - this might vary from setup to setup. To verify that, run the controller and rotate a wheel manually. You can see current positions/velocities by listening on
/joint_statestopic -ros2 topic echo /joint_states.
Xml
1<param name="encTicsPerRevL">756</param>
2<param name="encTicsPerRevR">756</param>- Max motor speed in rads/s
Xml
1<param name="maxVelL">32</param>
2<param name="maxVelR">32</param>- Both wheels have parameters for PID control set in that file, those values could need some tuning:
Xml
1<param name="closed_loopR">1</param>
2<param name="PID_P_R">0.2</param>
3<param name="PID_I_R">0.1</param>
4<param name="PID_D_R">0.0005</param>rae_hw/config/controller.yaml. wheel_separation and wheel_radius parameters might also need tuning depending on the setup.Implementation of motor control is found in rae_hw package. You can set the motor to be ready to receive twist commands on /cmd_vel topic by running:ros2 launch rae_hw control.launch.pyYou can then control the robot via keyboard teleop from your pc via (assuming you are connected to same network robot is in):sudo apt-get install ros-humble-teleop-twist-keyboardros2 run teleop_twist_keyboard teleop_twist_keyboard
LED node
- Control all (set control_type to 0) - gives all LEDs the same color
- Control single (control_type to 1) - lets you control a single LED by setting the single_led_n variable
- Custom control (control_type to 2) - you send a list that defines the value of every LED at once, as in the snippet below.
Python
for i in range(40):
    led_msg.single_led_n = 0
    led_msg.control_type = 2
    if i < 8:
        color = "white"
        led_msg.data[i] = colors[color]
    if i > 9 and i < 14 and angular_speed > 0.0 and blinking == True:
        color = "yellow"
        led_msg.data[i] = colors[color]
    if i > 20 and i < 29 and linear_speed < 0.0:
        color = "red"
        led_msg.data[i] = colors[color]
    if i > 34 and i < 39 and angular_speed < 0.0 and blinking == True:
        color = "yellow"
        led_msg.data[i] = colors[color]
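For reference, a complete publisher node built around the snippet above could look roughly like the sketch below. The field names (control_type, single_led_n, data) are taken from the snippet; whether data holds std_msgs/ColorRGBA values and how many LEDs there are should be verified against rae_msgs/msg/LEDControl.
Python
# Minimal sketch of a /led publisher. Field names follow the snippet above;
# verify them against rae_msgs/msg/LEDControl before use.
import rclpy
from rclpy.node import Node
from std_msgs.msg import ColorRGBA
from rae_msgs.msg import LEDControl


class LedDemo(Node):
    def __init__(self):
        super().__init__('led_demo')
        self.pub = self.create_publisher(LEDControl, '/led', 10)
        self.timer = self.create_timer(1.0, self.publish_leds)

    def publish_leds(self):
        msg = LEDControl()
        msg.control_type = 2  # custom control: set every LED individually
        msg.single_led_n = 0
        white = ColorRGBA(r=1.0, g=1.0, b=1.0, a=1.0)
        # Assumed 40 LEDs, matching the loop in the snippet above
        msg.data = [white for _ in range(40)]
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(LedDemo())
    rclpy.shutdown()


if __name__ == '__main__':
    main()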
The peripheral nodes are started as part of these launch files:
Command Line
ros2 launch rae_hw peripherals.launch.py
ros2 launch rae_hw control.launch.py
ros2 launch rae_bringup robot.launch.py
LCD node
The LCD node expects BGR8 images; convert them with cv_bridge before publishing, e.g. img_msg = self.bridge.cv2_to_imgmsg(img_cv, encoding="bgr8")
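Putting it together, a small node that resizes an OpenCV image to 160x80 and publishes it on /lcd might look like the sketch below. The topic name and BGR8 requirement come from the peripherals list above; the gray test image is just a placeholder.
Python
# Minimal sketch: publish a resized BGR8 image to the LCD topic.
import cv2
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge


class LcdPublisher(Node):
    def __init__(self):
        super().__init__('lcd_publisher')
        self.pub = self.create_publisher(Image, '/lcd', 10)
        self.bridge = CvBridge()
        self.timer = self.create_timer(1.0, self.publish_image)

    def publish_image(self):
        # Placeholder content: a solid gray frame
        img_cv = np.full((80, 160, 3), 128, dtype=np.uint8)
        # The LCD expects 160x80 BGR8 images; resize in case the source is larger
        img_cv = cv2.resize(img_cv, (160, 80))
        self.pub.publish(self.bridge.cv2_to_imgmsg(img_cv, encoding='bgr8'))


def main():
    rclpy.init()
    rclpy.spin(LcdPublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()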
Microphone and speakers
Python
if msg.encoding == "S32LE":
    audio_data = np.frombuffer(msg.data, dtype=np.int32)
elif msg.encoding == "S16LE":
    audio_data = np.frombuffer(msg.data, dtype=np.int16)
if msg.layout == Audio.LAYOUT_INTERLEAVED:
    # Deinterleave channels
    audio_data = audio_data.reshape((msg.frames, msg.channels))
Example GStreamer pipelines (run from the workspace where gst_bridge is built):
- Play an audio file through the speakers by publishing to /audio_out - gst-launch-1.0 --gst-plugin-path=install/gst_bridge/lib/gst_bridge/ filesrc location=sample.mp3 ! decodebin ! audioconvert ! rosaudiosink ros-topic="/audio_out"
- Record audio from a ROS audio topic to a wav file - gst-launch-1.0 --gst-plugin-path=install/gst_bridge/lib/gst_bridge/ rosaudiosrc ros-topic="audio_out" ! audioconvert ! wavenc ! filesink location=mic1.wav
- Display a camera stream on the LCD (framebuffer) - gst-launch-1.0 --gst-plugin-path=install/gst_bridge/lib/gst_bridge/ rosimagesrc ros-topic="/rae/right_front/image_raw" ! videoconvert ! videoscale ! video/x-raw,width=160,height=80 ! fbdevsink
- Show an audio spectrum from the microphones on the LCD - gst-launch-1.0 alsasrc device="hw:0,1" ! audio/x-raw,rate=48000,format=S32LE ! audioconvert ! spectrascope ! videoconvert ! video/x-raw,width=160,height=80 ! fbdevsink
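The deinterleaving snippet above can also be dropped into a small /audio_in subscriber, for example to print the per-channel signal level. A sketch assuming only the Audio message fields already used above (encoding, data, layout, frames, channels):
Python
# Minimal sketch: subscribe to the microphone topic and print per-channel RMS level.
# Field names (encoding, data, layout, frames, channels) follow the snippet above.
import numpy as np
import rclpy
from rclpy.node import Node
from audio_msgs.msg import Audio


class MicLevel(Node):
    def __init__(self):
        super().__init__('mic_level')
        self.sub = self.create_subscription(Audio, '/audio_in', self.callback, 10)

    def callback(self, msg):
        dtype = np.int32 if msg.encoding == 'S32LE' else np.int16
        audio_data = np.frombuffer(msg.data, dtype=dtype)
        if msg.layout == Audio.LAYOUT_INTERLEAVED:
            # Deinterleave channels, as in the snippet above
            audio_data = audio_data.reshape((msg.frames, msg.channels))
        rms = np.sqrt(np.mean(audio_data.astype(np.float64) ** 2, axis=0))
        self.get_logger().info(f'RMS per channel: {rms}')


def main():
    rclpy.init()
    rclpy.spin(MicLevel())
    rclpy.shutdown()


if __name__ == '__main__':
    main()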
USB ports
To switch the USB port to host mode:
Command Line
gpioset gpiochip0 44=1
echo host > /sys/kernel/debug/usb/34000000.dwc3/mode
To switch it back to device mode:
Command Line
gpioset gpiochip0 44=0
echo device > /sys/kernel/debug/usb/34000000.dwc3/mode