System Summary

Problem Description

Exploration and navigation in unknown environments is a challenging task for ground rovers. While AGVs are often equipped with a variety of sensors that can detect the surrounding environment, the information these sensors provide is limited to the immediate surroundings of the AGV and does not reveal the obstacles along, or the eventual destination of, different paths. Without additional information, an AGV would have to proceed down each path, prioritized by the best available heuristic, in the hope that no obstacle or dead end blocks the way.

In time-critical settings such as disaster response and relief, this trial-and-error approach does not suffice. Ground operators need to know which paths are safe and free of hazards so that they can deliver the required aid to trapped survivors as quickly as possible. To enhance the exploration and navigation capabilities of autonomous ground vehicles, we propose FalconEye, a heterogeneous mapping solution that combines sensor input from UAVs to detect the open routes and obstacles that the AGVs' own sensors cannot. Using airborne HD cameras and ground LiDAR sensors, FalconEye creates and operates within a 3D map whose range far exceeds that of maps created by ground-only robotic systems.
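The core idea of combining the two sensor streams can be sketched as a cell-wise merge, assuming both the aerial camera output and the ground LiDAR scan have been rasterized into occupancy grids on a shared coordinate frame. The grids, cell encoding, and `fuse_grids` helper below are illustrative assumptions, not the actual FalconEye pipeline:

```python
def fuse_grids(aerial, ground):
    """Cell-wise fusion of two occupancy grids on a shared frame.

    Cell values: -1 = unknown, 0 = free, 1 = obstacle.
    A cell is an obstacle if either source reports one, free if
    either source reports free (and neither reports an obstacle),
    and unknown otherwise.
    """
    fused = []
    for aerial_row, ground_row in zip(aerial, ground):
        row = []
        for a, g in zip(aerial_row, ground_row):
            if a == 1 or g == 1:
                row.append(1)
            elif a == 0 or g == 0:
                row.append(0)
            else:
                row.append(-1)
        fused.append(row)
    return fused

# Toy inputs: the ground LiDAR only covers cells near the AGV,
# while the aerial camera fills in the rest of the scene.
aerial = [[0, 0, 1],
          [0, 1, 0],
          [0, 0, 0]]
ground = [[0, -1, -1],
          [1, -1, -1],
          [-1, -1, -1]]
fused = fuse_grids(aerial, ground)
```

Note that where the two sources disagree (the ground LiDAR sees an obstacle the aerial view misses, as in the bottom-left of the toy map), the fused map conservatively keeps the obstacle.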

Use Case

In 2020, Pittsburgh is ravaged by a magnitude 3.0 earthquake and suffers heavy damage to its buildings, pipelines, and electrical grid. Crumbled buildings, displaced streets, and the resulting debris leave many locations inaccessible from the ground. Some buildings remain standing, but rescue personnel are uncertain about the damage the structures have sustained and the location of any safe access points. Satellite images lack the fidelity and viewing angle to show the areas rendered inaccessible by obstacles. Many people are trapped amid the debris, time is of the essence, and disaster relief teams urgently need more information about the landscape of the disaster zones.
The city of Pittsburgh dispatches Captain Dolan with FalconEye, marking the center of the disaster zone as the target destination. The UAVs take off and fly ahead of the AGV, gathering aerial information about the environment and stitching it into a 2D map. As the aerial map forms, the UAVs transmit it to the AGV, which localizes itself within the map and autonomously plans and follows a path that avoids the detected obstacles. By using both aerial and ground data, the AGV avoids significantly more ground obstructions and arrives at the target location faster than would have been possible with the AGV's sensors alone.
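Once the AGV has the aerial map, planning a route around the detected obstacles can be sketched with a standard grid search. The snippet below runs A* over a toy occupancy grid; the grid, start, and goal are illustrative assumptions, not the actual map representation or planner used by FalconEye:

```python
import heapq

def plan_path(grid, start, goal):
    """A* over a 2D occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal,
    or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from = {}
    best_g = {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:          # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:               # reconstruct path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cell))
    return None

# Toy aerial map: walls the AGV's own sensors could not see past.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 0, 0, 0],
]
path = plan_path(grid, (0, 0), (4, 3))
```

This is exactly the benefit described above: with the full map in hand, the planner routes around both walls up front instead of discovering them one dead end at a time.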


Note: To simplify the problem, we will use fiducial markers to represent both the AGV and the valid nodes of the traversable path. Below is a pictorial representation of how the system behaves with and without the drone.