Implementation

Full System Development

Currently, we have finished all hardware development on the system and have implemented Dragoon's basic software functionalities, including mapping and localization, as well as human detection and localization using an RGB sensor and a depth sensor. The system currently performs as expected in bright, clear conditions. Our goal for next semester is to enhance Dragoon's capabilities so that it can perform its tasks in the dark and smoky conditions expected in post-disaster scenarios.

Dragoon’s first outing after completion of hardware development

Dragoon’s first outing (external video)

Visualization of software stack (external video)


Subsystem Development

Robot Base

As planned, the spring semester was heavily focused on hardware development. We ordered and received our SuperDroid robot base, which came with a remote controller. We then designed the layout of the robot and fabricated, using either the 3D printer or the machine shop, the mounts needed to attach all of our components to the robot. Prior to fabrication, we performed load analysis on all of the mounts to ensure that they would be sturdy. In parallel with part fabrication, we developed the robot's power distribution board (PDB). Each iteration of the board was designed, received from the manufacturer, and then assembled using the tools and the reflow oven in the lab.

Simulation of full hardware stack in Rviz (external video)

Complete hardware stack

CAD model of complete hardware stack

RealSense D435i and Seek Thermal sensors on custom-made sensor mount and casing

Version 2 of PDB

Custom-made LiDAR mount, RealSense/Seek mount, and AGX mount on robot base

Fabrication of the battery holder by 3D printing

Load analysis of LiDAR mount

Version 1 of PDB

Sensing

Our sensing subsystem consists of three major sensors: the RealSense RGB sensor, the RealSense depth sensor, and the Seek Thermal sensor. All sensors have been calibrated to obtain their intrinsic parameters, and the extrinsic parameters between the two RealSense sensors have also been obtained. For SVD, we used the registration between the two RealSense sensors to detect humans; the plan for FVD is to register both the RealSense RGB and the Seek Thermal to the RealSense depth sensor in order to extract the location of each human in the frame.
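To make the registration step concrete, below is a minimal sketch of mapping a pixel from the depth image into the RGB image using pinhole intrinsics and a rigid extrinsic transform. All matrix and vector values shown are illustrative placeholders, not our calibration results; the real numbers come from the calibration described above.

```python
import numpy as np

# Hypothetical pinhole intrinsics (fx, fy, cx, cy); real values come from calibration.
K_DEPTH = np.array([[385.0, 0.0, 320.0],
                    [0.0, 385.0, 240.0],
                    [0.0, 0.0, 1.0]])
K_RGB = np.array([[615.0, 0.0, 320.0],
                  [0.0, 615.0, 240.0],
                  [0.0, 0.0, 1.0]])
# Extrinsics: rigid transform from the depth camera frame to the RGB camera frame.
R_DEPTH_TO_RGB = np.eye(3)
T_DEPTH_TO_RGB = np.array([0.015, 0.0, 0.0])  # ~15 mm baseline, illustrative only

def register_depth_pixel(u, v, depth_m):
    """Map a depth-image pixel (u, v) with range depth_m (meters)
    to the corresponding pixel in the RGB image."""
    # Deproject: pixel + depth -> 3D point in the depth camera frame.
    xyz_depth = depth_m * np.linalg.inv(K_DEPTH) @ np.array([u, v, 1.0])
    # Transform the point into the RGB camera frame via the extrinsics.
    xyz_rgb = R_DEPTH_TO_RGB @ xyz_depth + T_DEPTH_TO_RGB
    # Project back onto the RGB image plane.
    uvw = K_RGB @ xyz_rgb
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

The same pattern extends to the planned FVD registration: once the Seek Thermal extrinsics are calibrated against the RealSense depth sensor, the transform and projection above apply with the thermal camera's matrices substituted in.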

RealSense RGB, RealSense depth, and Seek Thermal sensors running on the robot's AGX (external video)

Registration of points between the RealSense RGB sensor and the RealSense depth sensor

Calibration of RealSense

Result of calibration of Seek Thermal sensor

Software

Our software pipeline consists of three major components: SLAM, human detection, and defogging. We use our Velodyne VLP-16 and the onboard RealSense IMU as inputs to the Cartographer SLAM system, which we have implemented on Dragoon. We use YOLO to detect humans in both the RealSense RGB and Seek Thermal streams; the calibration between the RealSense RGB and depth streams allows us to transfer the bounding box from the RGB image to the depth image and extract the depth of the detected human, giving us the 3D pose of each human in the frame. The global poses of humans shown on the visualizer are then determined by a Kalman filter, which fuses the human detections observed in each individual frame.
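The sketch below illustrates the detection-to-pose step under simplifying assumptions: the depth image is already registered to the RGB image, depth values are in meters, and the intrinsics shown are placeholders for the calibrated RealSense values. A minimal constant-position Kalman filter then shows how repeated detections can be fused into a single global estimate; the actual filter running on Dragoon may use a different model.

```python
import numpy as np

K_RGB = np.array([[615.0, 0.0, 320.0],   # illustrative intrinsics; real values
                  [0.0, 615.0, 240.0],   # come from the RealSense calibration
                  [0.0, 0.0, 1.0]])

def bbox_to_3d(bbox, depth_image):
    """Deproject the center of a YOLO bounding box (x1, y1, x2, y2)
    to a 3D point in the camera frame, using the registered depth image
    (assumed here to hold depth in meters)."""
    u = int((bbox[0] + bbox[2]) / 2)
    v = int((bbox[1] + bbox[3]) / 2)
    z = depth_image[v, u]  # depth at the bounding-box center
    return z * np.linalg.inv(K_RGB) @ np.array([u, v, 1.0])

class ConstantPositionKF:
    """Minimal Kalman filter over a static 3D position: each new
    detection ('evidence') refines the estimated human location."""
    def __init__(self, initial, meas_var=0.25, process_var=0.01):
        self.x = np.asarray(initial, dtype=float)  # state: 3D position
        self.P = np.eye(3)                         # state covariance
        self.R = meas_var * np.eye(3)              # measurement noise
        self.Q = process_var * np.eye(3)           # process noise

    def update(self, z):
        self.P = self.P + self.Q                     # predict (identity motion model)
        K = self.P @ np.linalg.inv(self.P + self.R)  # Kalman gain
        self.x = self.x + K @ (np.asarray(z) - self.x)
        self.P = (np.eye(3) - K) @ self.P
        return self.x
```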

Development of the defogging pipeline, as well as the registration between the Seek Thermal and the RealSense depth sensor, will take place in the fall.

SLAM system working on robot base (external video)

Overview of Cartographer SLAM system

YOLO detects a human in an image returned by the Seek Thermal sensor

Vehicle UI

Vehicle UI architecture

Current visualizer

As seen above, our Vehicle UI displays the Cartographer map (with the locations of detected humans drawn on it) as well as data from our sensors through Rviz on a screen in front of the operator. In the top left corner of the visualizer, the operator can see a watchdog module that indicates the heartbeats of the different sensors. In the top middle of the screen, the operator can see whether any humans have been detected, along with their locations. Red dots on the map indicate individual pieces of "evidence" of human detection from each RGB frame, while green dots mark the positions of humans as estimated by the Kalman filter from that evidence. The operator can also see the path taken by the robot. Finally, on the right side, the operator can see the live Seek Thermal, RealSense RGB (with YOLO running on it), and RealSense depth streams.
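As an illustration of how the red and green dots could be drawn in Rviz, here is a minimal sketch that builds standard ROS visualization_msgs markers. The topic name, namespace, and "map" frame ID are assumptions for the example, not Dragoon's actual code.

```python
import rospy
from visualization_msgs.msg import Marker

def make_dot(x, y, marker_id, is_filtered):
    """Build an Rviz sphere marker: red for per-frame detection
    'evidence', green for Kalman-filtered human positions."""
    m = Marker()
    m.header.frame_id = "map"        # assumes the Cartographer map frame is 'map'
    m.header.stamp = rospy.Time.now()
    m.ns = "human_detections"        # hypothetical namespace
    m.id = marker_id
    m.type = Marker.SPHERE
    m.action = Marker.ADD
    m.pose.position.x = x
    m.pose.position.y = y
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.2   # 20 cm dot
    m.color.a = 1.0
    m.color.g = 1.0 if is_filtered else 0.0   # green: filtered estimate
    m.color.r = 0.0 if is_filtered else 1.0   # red: raw evidence
    return m

if __name__ == "__main__":
    rospy.init_node("human_marker_publisher")
    pub = rospy.Publisher("human_markers", Marker, queue_size=10)
    rospy.sleep(0.5)  # give Rviz time to subscribe
    pub.publish(make_dot(1.0, 2.0, marker_id=0, is_filtered=True))
```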