
Ground Robot Design: Hands-On Experience with Sensor Payload Integration

Introduction

The SubT challenge requires a team to design and build a ground robot that can navigate a challenging environment and detect specific artifacts. In this article, we explore the design and development of our ground robot, focusing on the sensor payload and its components. Our robot is equipped with a range of sensors, including RGBD cameras, thermal cameras, a CO2 gas sensor, and more. We discuss the selection of these sensors, the payload build process, and the challenges we faced during development.

Hands-On Ground Robot Design

This post is part of the Hands-On Ground Robot Design series. See the series page for the full list of posts covering mechanical, electrical, motion control, computing, sensors, and more.

Sensor Selection

Based on the system requirements, we determined that we needed RGBD cameras in all four directions to help find the required artifacts (survivors, fire extinguishers, cell phones, rope, backpacks, gas, etc…) wherever they appeared. We also wanted thermal sensing for detecting humanoid heat signatures and CO2 gas sensing for detecting the gas artifact.

The reason for wanting depth (D) in addition to RGB was to determine the location of artifacts that were near the robot. If an object was far from the robot, we primarily used the Velodyne to determine its position relative to the robot, and from that the artifact's position in the global map.
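
As a rough sketch of that geometry (not our actual code), a depth pixel can be back-projected into a 3D point using the camera intrinsics, then transformed into the global map frame using the pose estimate from SLAM. The intrinsic values and transform below are illustrative only:

```python
import numpy as np

def pixel_to_camera(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with metric depth into a 3D point
    in the camera frame, using pinhole intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def camera_to_map(p_cam, T_map_cam):
    """Transform a camera-frame point into the global map frame
    using a 4x4 homogeneous transform (e.g., from the SLAM pose)."""
    p_h = np.append(p_cam, 1.0)        # homogeneous coordinates
    return (T_map_cam @ p_h)[:3]

# Illustrative intrinsics for a 640x480 depth image (not calibrated values)
fx = fy = 600.0
cx, cy = 320.0, 240.0
p_cam = pixel_to_camera(320, 240, 2.0, fx, fy, cx, cy)  # point on the optical axis
```

A far-away artifact would instead be localized by associating a detection bearing with Velodyne returns, but the final map-frame transform step is the same.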

For localization and mapping (SLAM) we needed a Velodyne Puck and an IMU. To process all of that data, we decided to use two separate computers: an 8th-gen Intel NUC i7 for SLAM and an NVIDIA Xavier for object detection from the cameras.

Other auxiliary hardware was needed to make this payload work: LEDs, a LuxDrive LED driver, USB hubs, an Ethernet switch, time-synchronization hardware, and two Vicor DC/DC converters for power conversion and filtering.

The sensors selected were:

  • Velodyne (VLP-16) Puck
  • Xsens MTi-200-VRU-2A8G4 IMU
  • Intel RealSense D435 (x4)
  • Teledyne FLIR-Boson 640 Thermal Cam (x2)
  • UEye UI-3241LE-M/C RGB Cam (x2)
  • CozIR®-LP3 1% CO2 Sensor
  • Seeed Studio ReSpeaker 107990053 Microphone Array
  • Intel Dual Band Wireless-AC 8265 WiFi/Bluetooth module

Payload Build

The first concept image below evolved into the final payload shown further below. For SLAM accuracy, we made sure the IMU was mounted rigidly and close to the Velodyne. After some early robot flips during testing, we quickly added a roll cage to help protect the payload.


The image above shows the underside of the top plate during the wiring process; the Velodyne is below this plate on the table. Visible are the orange IMU, two black USB hubs, and the PCB Ethernet switch.

SubT sensor payload insides
Image during the wiring process showing some of the internal components. The two black thermal cameras protrude from the 45-degree sections at the top sides of the payload. Near the bottom center of the image is the Intel NUC computer; behind it is a green circuit board with the DC/DC converters. Between the two thermal cameras are connectors for the Velodyne and the power switches for the two computers. The Xavier computer carries the "No Thermal Pad Applied Yet" label, with a fan mounted to its rear. In the top right corner of the image you can see a side wall, a fan, and a RealSense camera waiting to be mounted.

SubT sensor payload mostly assembled.
The image above shows the payload nearing completion, with all of the sides attached except for the rear wall. The Intel NUC computer is clearly visible behind where the rear plate mounts.

SubT sensor payload complete with roll cage
In the final build image above you can see the black Intel RealSense RGBD cameras, with the LEDs mounted above them. The cameras on the 45-degree angled aluminum pieces are the thermal cameras. The black section below the Velodyne holds the microphone array, which was largely unused.

Thermal Design / Issues

For cooling, we had four fans in the payload. The NUC had its blower attached, and we added a fan behind the Xavier; both of those fans circulated air within the payload. We also mounted a "push" fan and a "pull" fan on the two sides of the payload to maintain a constant flow of air through it. Several times during the project the Xavier kept restarting, and we suspected thermal issues. In one case, the fan mounted near the Xavier had failed. In the other cases, we found that if the Xavier operated at full memory utilization it could sometimes restart, and the error messages in the log file pointed to thermal issues.
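
A minimal way to catch this kind of overheating early (not something we ran at the time, just a sketch assuming a Linux target like the Xavier, where the kernel exposes temperatures under /sys/class/thermal) is to poll the thermal zones and flag any zone above a warning threshold:

```python
import glob

def read_thermal_zones(base="/sys/class/thermal"):
    """Read all thermal zones and return {zone name: degrees Celsius}.
    The kernel reports temperatures in millidegrees C."""
    temps = {}
    for zone in sorted(glob.glob(f"{base}/thermal_zone*")):
        try:
            with open(f"{zone}/type") as f:
                name = f.read().strip()
            with open(f"{zone}/temp") as f:
                temps[name] = int(f.read().strip()) / 1000.0
        except (OSError, ValueError):
            continue  # skip zones that fail to read
    return temps

def over_limit(temps, limit_c=90.0):
    """Return only the zones exceeding a warning threshold."""
    return {name: t for name, t in temps.items() if t > limit_c}
```

Logging this alongside memory utilization would have made it much easier to correlate the restarts with heat versus load.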

Camera Issues

USB Issues

The RealSense cameras had all sorts of detection issues. Often on boot, one camera would not be detected; after a reboot that camera would appear, but a different one would be missing. We eventually figured out that an older firmware version of the cameras seemed to be more reliable. With the older firmware, most of the time (but not always) all of the RealSense cameras were detected on the initial boot of the robot/computer.
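
A simple boot-time sanity check, assuming a Linux host with `lsusb` available, is to count how many RealSense devices actually enumerated on the USB bus before starting the rest of the software stack. The sample output and vendor/product IDs below are illustrative, not copied from our robot:

```python
import re
import subprocess

def count_devices(lsusb_output, pattern="RealSense"):
    """Count lines of `lsusb` output whose description matches pattern."""
    return sum(
        1 for line in lsusb_output.splitlines()
        if re.search(pattern, line, re.IGNORECASE)
    )

def cameras_present(expected=4, pattern="RealSense"):
    """Return True if at least `expected` matching cameras enumerated.
    A launch script could reboot or retry when this returns False."""
    out = subprocess.run(["lsusb"], capture_output=True, text=True).stdout
    return count_devices(out, pattern) >= expected
```

Gating the robot's startup on a check like this turns a silent missing-camera failure into an explicit retry, which is much easier to debug in the field.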

Dust Clouding all Images

Another issue we fought was the quality of the RealSense camera images when operating in dust and fog with the LEDs on the payload turned on (which was necessary in the dark underground environment). The images showed large specular reflections from the dust particles, making them almost unusable. We eventually stopped using the lights on the payload near the cameras; lights mounted to the robot body, angled up and off-axis from the cameras, produced much better images.
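
One way to quantify that degradation (a heuristic sketched here for illustration, not the method we used) is to measure the fraction of near-saturated pixels in a grayscale frame, since specular dust reflections show up as blown-out highlights:

```python
import numpy as np

def specular_fraction(gray, threshold=250):
    """Fraction of pixels at or above a near-saturation threshold in an
    8-bit grayscale frame; a rough proxy for specular dust reflections."""
    return float(np.mean(gray >= threshold))

def frame_usable(gray, max_fraction=0.05):
    """Flag frames where highlights cover too much of the image."""
    return specular_fraction(gray) <= max_fraction
```

Logging this metric while repositioning the lights would give a quick, objective way to compare on-axis versus off-axis lighting.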

Next Steps

Now that the robot is built, the next step is lots and lots of testing.

Conclusion

In this article, we explored the design and development of our ground robot for the SubT challenge, focusing on the sensor payload and its components. We discussed the selection of sensors, the payload build process, and the challenges we faced during development. We hope that this article has been informative and helpful for those interested in designing and building their own robots for the SubT challenge.

Frequently Asked Questions

Question 1: What sensors did you use for the SubT challenge?

We used a range of sensors, including RGBD cameras, thermal cameras, a CO2 gas sensor, and more.

Question 2: How did you select the sensors for your payload?

We selected the sensors based on the system requirements and the environment in which the robot would be operating.

Question 3: What was the biggest challenge you faced during development?

The biggest challenge we faced was the quality of the RealSense camera images when operating in dust and fog.

Question 4: How did you overcome the thermal issues with the Xavier computer?

We used four fans in the payload to circulate air and keep the Xavier cool, though full memory utilization could still occasionally trigger thermal restarts.

Question 5: What is the next step for your robot?

The next step is lots and lots of testing.
