Subsystem descriptions
Sensing
The system carries a variety of sensors for perception and flight information. A camera and a Velodyne Puck Hi-Res compose the sensor payload, while the DJI M600 provides onboard GPS and IMU sensors. Preliminary project work will ensure that the sensors' parameters are correctly set up and that each subsystem receives the sensor data it requires.
Power Systems
The Matrice 600 has onboard batteries that power both the rotors and the integrated GPS and IMU. However, all other sensors and electronics need separate power sources to operate.
To power the larger sensors and the onboard computer, additional protection and regulation are needed. A custom PCB accepts either a 7.4 V or an 18 V input and converts it to 12 V or 16 V, respectively. These circuits also include overvoltage and reverse-voltage protection. The PCB can be fed either by a LiPo battery or by the M600's XT30 port.
I/O hardware
The input/output hardware connects the sensors to the computer and the computer to the DJI drone. A USB-TTL cable connects the Matrice Lightbridge to the Jetson Xavier, transferring motor commands from the Xavier to the Matrice 600 and IMU/GPS data back to the Xavier. The Intel Realsense supplies data to the Jetson Xavier over USB-C/USB 3.0, and the lidar communicates with the Jetson Xavier over an Ethernet cable.
Perception
The purpose of the perception system is to process the sensor data and build a complete picture of the surrounding objects and environment. Currently, there are two core functionalities in the perception system.
Camera & Deep Learning
The first is the object tracker node. The node takes in RGB data from an Intel Realsense camera, performs object detection and classification on each frame with a custom-trained YOLO v3 network, and finally publishes object tracking information.
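A minimal sketch of the frame-to-frame association step such a tracker node performs is shown below. All names are illustrative assumptions, not the node's actual interface; the real node runs YOLO v3 on Realsense RGB frames, whereas here detections and tracks are simply bounding boxes matched by overlap.

```python
# Sketch of per-frame object tracking: associate new detections with
# existing tracks by bounding-box overlap (IoU). Names are hypothetical.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter)

def associate(tracks, detections, threshold=0.3):
    """Greedily match each detection to the best-overlapping track.

    tracks     : {track_id: box} from the previous frame
    detections : list of boxes from the current frame
    Returns (matches, unmatched_detection_indices).
    """
    matches, unmatched, used = {}, [], set()
    for d_idx, det in enumerate(detections):
        best, best_iou = None, threshold
        for t_id, box in tracks.items():
            if t_id in used:
                continue
            score = iou(box, det)
            if score > best_iou:
                best, best_iou = t_id, score
        if best is None:
            unmatched.append(d_idx)
        else:
            matches[best] = d_idx
            used.add(best)
    return matches, unmatched
```

Unmatched detections would spawn new tracks, and tracks with no match for several frames would be retired.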
Lidar
The bounding-box information for the tracked objects is then passed to the other core part of the perception system, the lidar processing node, which takes its primary sensor input from a Velodyne VLP-16 lidar. The Realsense camera and the lidar are calibrated to share the same local frame, so the lidar processing node can use the camera's intrinsics matrix to project the point cloud onto the camera image plane. This yields the point cloud points that fall inside each object's bounding box. A cluster-filtering step is then applied to these points to remove background objects and noise. Finally, the world coordinates of the remaining points are averaged to localize each tracked object.
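The projection-and-averaging step can be sketched as follows. This is an illustrative simplification under stated assumptions: the lidar points are taken to be already transformed into the camera frame (the calibrated shared frame makes this a fixed transform), and the cluster filter is omitted in favor of a plain average of the in-box points.

```python
import numpy as np

# Sketch of projecting lidar points through the camera intrinsics and
# localizing an object from the points inside its bounding box.
# Assumes points are already expressed in the camera frame (z forward).

def localize_object(points_cam, K, bbox):
    """Average the 3D points whose image projection falls inside bbox.

    points_cam : (N, 3) lidar points in the camera frame
    K          : (3, 3) camera intrinsics matrix
    bbox       : (u_min, v_min, u_max, v_max) from the object tracker
    Returns the mean 3D position, or None if no points project into bbox.
    """
    # Pinhole projection: [u*w, v*w, w]^T = K @ [x, y, z]^T
    in_front = points_cam[:, 2] > 0
    uvw = (K @ points_cam.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]

    u_min, v_min, u_max, v_max = bbox
    in_box = (
        in_front
        & (uv[:, 0] >= u_min) & (uv[:, 0] <= u_max)
        & (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max)
    )
    if not in_box.any():
        return None
    # The real node first applies cluster filtering to reject background
    # points; here we simply average the in-box points.
    return points_cam[in_box].mean(axis=0)
```

In the actual pipeline the cluster filter runs before the average, so stray background returns that happen to project into the box do not bias the localization.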
Flight Platform
The flight requirement calls for a dedicated flight platform. The Flight Platform subsystem is responsible for all aspects of flight and mobility for this project. It includes the UAV, the drone power supply, and the flight controller. The team will use a DJI M600 as the UAV: Dr. George Kantor offered the use of his M600 in exchange for functionality testing and instruction of him and his team in flight operation. The M600 also comes equipped with DJI's SDK, which enables ROS integration, high-level path planning, and several other built-in functionalities.
The main focus of this work is to develop a flight computer that converts the local path created by the planning subsystem into commands the DJI M600 can use to make the required movements, leveraging the drone's built-in ability to fly along a commanded path. The flight subsystem is composed of three parts: the flight computer, the DJI microprocessor, and the propellers and motors. The flight computer takes data from the local planner and outputs waypoints the DJI microprocessor understands; the DJI SDK then outputs the necessary torques and RPMs to the motors.
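One piece of the flight-computer step can be sketched as below. This is not the DJI SDK API; it is a hypothetical helper that resamples the local planner's path into evenly spaced waypoints of the kind a waypoint-following flight stack consumes.

```python
import math

# Illustrative sketch (not the DJI SDK interface): densify a local path
# so that consecutive waypoints are no farther apart than max_spacing.

def path_to_waypoints(path, max_spacing):
    """path is a list of (x, y, z) tuples; returns a denser waypoint list."""
    waypoints = [path[0]]
    for (x0, y0, z0), (x1, y1, z1) in zip(path, path[1:]):
        dist = math.dist((x0, y0, z0), (x1, y1, z1))
        # Split each segment into enough equal steps to respect the spacing.
        steps = max(1, math.ceil(dist / max_spacing))
        for i in range(1, steps + 1):
            t = i / steps
            waypoints.append(
                (x0 + t * (x1 - x0), y0 + t * (y1 - y0), z0 + t * (z1 - z0))
            )
    return waypoints
```

A maximum-spacing parameter like this would be tuned against the waypoint limits and tracking behavior of the M600's onboard controller.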
If this flight system fails, a different system can be built around other DJI drones to carry the sensor package. This could mean reducing the sensor payload to just the lidar so that the sensor pod can be carried by smaller drones. Further weight reduction would require performing the computation on a base station instead of on the drone, removing the need for extra batteries and an onboard computer.
Planning
The planning subsystem receives information from the Sensing and Perception subsystems and produces flight commands for the DJI M600. Planning includes both a global planner and a local planner. The global planner computes the optimal UAV behavior for an obstacle-free environment. The local planner receives the optimal path from the global planner and predicted obstacle trajectories from the perception subsystem. If a globally planned trajectory is deemed too dangerous, the local planner produces an adjusted flight trajectory that maintains a safe distance from each obstacle.
The current implementation uses a 2D potential field approach for both the global and local planners. When the drone is not near an obstacle, the field is an attraction field toward the next waypoint. When an obstacle intrudes on the drone's safety radius, the potential field switches to an orbit field that guides the drone around the obstacle.
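The attraction/orbit switching logic can be sketched in 2D as follows. Gains, radii, and function names are illustrative assumptions, not the planner's actual parameters.

```python
import math

# Minimal 2D sketch of the hybrid potential field: attract toward the
# next waypoint, but switch to an orbit (tangential) field whenever an
# obstacle intrudes on the safety radius. All parameters are illustrative.

def planner_velocity(pos, waypoint, obstacle, safety_radius, gain=1.0):
    """Return a unit-magnitude 2D velocity command for the drone at pos."""
    ox, oy = obstacle[0] - pos[0], obstacle[1] - pos[1]
    d_obs = math.hypot(ox, oy)
    if 0 < d_obs < safety_radius:
        # Orbit field: move perpendicular to the obstacle direction,
        # carrying the drone around the obstacle.
        return (-gain * oy / d_obs, gain * ox / d_obs)
    # Attraction field: head straight for the next waypoint.
    wx, wy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    d_wp = math.hypot(wx, wy)
    if d_wp == 0:
        return (0.0, 0.0)
    return (gain * wx / d_wp, gain * wy / d_wp)
```

The orbit direction here is fixed (always counterclockwise); a fuller implementation would pick the orbit direction that rejoins the global path sooner and blend the two fields near the switching boundary to avoid velocity discontinuities.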