Documents

Design Reviews

Conceptual Design Review: Report | Presentation
Preliminary Design Review: Report | Presentation
Critical Design Review: Report | Presentation
System Development Review: Report | Presentation
Final Report: Report | Presentation

Individual Lab Reports

Kangni Liu | Guangzhao Li | Ting-Wei Wong | Siddharth Saha
ILR01 | ILR01 | ILR01 | ILR01
ILR02 | ILR02 | ILR02 | ILR02
ILR03 | ILR03 | ILR03 | ILR03
ILR04 | ILR04 | ILR04 | ILR04
ILR05 | ILR05 | ILR05 | ILR05
ILR06 | ILR06 | ILR06 | ILR06
ILR07 | ILR07 | ILR07 | ILR07
ILR08 | ILR08 | ILR08 | ILR08
ILR09 | ILR09 | ILR09 | ILR09
ILR10 | ILR10 | ILR10 | ILR10

Standards and Regulations

Presentation

Design Brainstorming

This is the draft of the Unitree network configuration for wireless image transmission. UDP packets flow from the robot to the switch, then to the Raspberry Pi, and finally to the desktop.

Figure 1. Unitree network configuration design draft
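For illustration, a minimal UDP relay of the kind the Raspberry Pi performs in this design could look like the sketch below. This is not our deployed code; the ports and the desktop address are placeholders that depend on the actual network setup.

```python
import socket

# Placeholder addresses and ports; actual values depend on the network setup.
LISTEN_ADDR = ("0.0.0.0", 9201)           # port the robot streams image packets to
DESKTOP_ADDR = ("192.168.123.100", 9201)  # desktop that receives the stream

def relay():
    """Forward every UDP packet received from the robot on to the desktop."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    while True:
        packet, _src = sock.recvfrom(65535)  # 65535 = max UDP payload size
        sock.sendto(packet, DESKTOP_ADDR)

if __name__ == "__main__":
    relay()
```

Because the Pi sits between the robot-side switch and the desktop in the draft above, a relay of this form lets the desktop receive the image stream without joining the robot's internal network.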

Here is a draft we wrote for the system design.

Figure 2. System design brainstorm

CAD Model

Our team's CAD model can be found on the [System Implementation/Hardware Design and Integration] page.

Power Distribution Board: Add-on Gas Detection 

As part of the PCB design work, we developed a breakout board that powers a gas detector, LED lights, and a buzzer, serving as a warning device when dangerous gas is detected above a safe threshold. The schematic is shown in Figure 3 below. The Arduino Nano is powered either by the robot payload's USB port (in normal use) or by a laptop (for demos) via a 5 V USB-to-micro-USB cable. The Arduino board serves as the processing unit that operates the MQ-135 gas detector, as well as the buzzer for audible alarms and the LEDs for visual warnings.

Figure 3. PCB schematic

The bill of materials (BOM) for the basic components is shown in Table 1 below.

Item No. | Name | Description | Quantity | Designator | Manufacturer Part Number | Vendor for purchase | Cost per part [$]
1 | Arduino Nano | Microcontroller board | 1 | ABX00053 | ABX00053 | Digikey | 30.60
2 | TLLG4400 | Green LED | 1 | LED1 | TLLG4400 | Digikey | 0.60
3 | TLLR4400 | Red LED | 1 | LED2 | TLLR4400 | Digikey | 0.51
4 | R1, R2 | Resistor, 1600 ohm | 2 | R1, R2 | MBB02070C1601FCT00 | Digikey | 0.29
5 | Buzzer, magnetic | Buzzer, 5 V, 30 mA | 1 | LS1 | AI-1224-TWT-5V-R | Digikey | 1.05
6 | Pin header | 1x4, 2.54 mm pitch | 1 | SV1 | PPTC041LFBN-RC | Digikey | 0.45
7 | MQ-135 | Gas detector | 1 | SV1 | 430578031 (Ximimark) | Amazon | 8.99
Table 1. Bill of materials for the PCB components.

The board layout is shown in Figure 4 below.

Figure 4. PCB breakout board layout

Figure 5. PCB layout designed by Guangzhao
Figure 6. PCB board
Figure 7. PCB board
Figure 8. CO2 detector

Drawings, schematics, and datasheets

Electrical: PCB Datasheet

Datasheet1

Datasheet2

Datasheet3

Datasheet4

Datasheet5

Datasheet6

Datasheet7

Software Architecture

Here is the software flow for the legged control system.

Component testing & experiment results

Human Detection

Figure 9. Human detection system testing results
Figure 10. Ground truth and YOLOv8 prediction

Previously, we tested our system on a seniors RGB dataset, but it does not fit our use case because of the low-light conditions we target. To further test the human detection system, our team built a 500-image dataset. We took videos of humans standing 5 meters and 3 meters away from the robot, making different postures and hiding behind chairs and wooden boards. As the live tests showed, YOLO handles easy cases reliably, so we skipped frames that were easy to detect, as well as repeated ones, and handpicked frames that looked challenging and close to realistic disaster scenarios.

Our system reaches 80% precision on the 200 images taken 3 meters away, and 70% precision on the 300 images taken 5 meters away.
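For reference, precision here is the fraction of predicted detections that match a ground-truth human. A minimal sketch of the computation is below; the counts are illustrative (one prediction per image is our assumption, since the exact evaluation script is not shown here).

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Precision = TP / (TP + FP)."""
    return true_positives / (true_positives + false_positives)

# Illustrative counts consistent with the reported numbers:
print(precision(160, 40))   # 3 m set: 160 correct of 200 predictions -> 0.8
print(precision(210, 90))   # 5 m set: 210 correct of 300 predictions -> 0.7
```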

Figure 11. Testing results from YOLOv8

Payload 3D LiDAR SLAM

Figure 12. 3D point cloud and trajectory, SQH office


In Figure 12, the 3D point cloud and motion trajectories are displayed for the tests at AirLab's Squirrel Hill office and at NSH B-level. For the AirLab test, the sensor payload was mounted on a moving cart, which produced a visibly cleaner point cloud. For the NSH B-level tests, we increased the difficulty by carrying the payload on the shoulder while recording maps, which mimics the walking gait the payload would experience on a quadruped robot. As the results show, the point cloud becomes blurrier, without clear-cut geometry, since the larger and more challenging B-level environment also plays a role in the mapping process.

Per our targeted performance requirements for the Spring Validation Demonstration and Encore, the payload 3D LiDAR SLAM subsystem was not aimed at a specific quantified metric by the end of the Spring 2023 semester, given the two-month delay in hardware delivery. The system did show promising preliminary results in support of our end goals of exploration and human localization, producing reasonably accurate trajectories and 3D maps. Going forward, the performance requirement for the 3D SLAM subsystem will be met in the first quarter of the Fall 2023 semester and evaluated during progress reviews. The test will evaluate the absolute Euclidean distance error between the start and end poses, given that they coincide in the ground truth.
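As a sketch of that evaluation, assuming the estimated trajectory is available as a sequence of xyz positions and the recording starts and ends at the same physical spot:

```python
import numpy as np

def start_end_drift(trajectory: np.ndarray) -> float:
    """Absolute Euclidean distance between the first and last estimated
    positions. Since the ground-truth start and end poses coincide, any
    nonzero distance is accumulated SLAM drift.

    trajectory: (N, 3) array of estimated xyz positions, in meters.
    """
    return float(np.linalg.norm(trajectory[-1] - trajectory[0]))

# Illustrative usage with a made-up trajectory:
traj = np.array([[0.00, 0.00, 0.00],
                 [1.20, 0.10, 0.00],
                 [0.90, 1.50, 0.10],
                 [0.05, 0.08, 0.02]])  # ends near, not exactly at, the start
print(f"start/end drift: {start_end_drift(traj):.3f} m")  # ~0.096 m
```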

Software

All code has been developed under version control on GitHub. The software stack is in large part a ROS2 workspace written predominantly in C++; Python is also used for the human detection function. The team also built a Docker container and worked toward environment-agnostic code.

Here is an example of our code, which is used for human detection.
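The original listing is not reproduced on this page; the following is a minimal sketch of the detection step using the Ultralytics YOLOv8 API. The weights file and confidence threshold are placeholders, not necessarily the deployed settings.

```python
from ultralytics import YOLO

# Placeholder weights; the deployed model may be a different checkpoint.
model = YOLO("yolov8n.pt")

def detect_humans(image_path: str, conf: float = 0.25):
    """Return bounding boxes (x1, y1, x2, y2) of detected people."""
    results = model(image_path, conf=conf)
    boxes = []
    for result in results:
        for box in result.boxes:
            if int(box.cls) == 0:  # COCO class 0 is "person"
                boxes.append(tuple(box.xyxy[0].tolist()))
    return boxes

if __name__ == "__main__":
    print(detect_humans("test_frame.jpg"))
```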

Here is another piece of Arduino code, used for the CO2 detector.
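The Arduino listing itself is also not reproduced here. As an illustration of the same threshold logic in the laptop demo configuration described above, here is a host-side Python sketch; it assumes the Arduino firmware prints one raw MQ-135 reading per line over the USB serial link, and the serial port, baud rate, and threshold value are all placeholders.

```python
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # placeholder; depends on how the Nano enumerates
BAUD = 9600            # placeholder; must match the Arduino sketch
GAS_THRESHOLD = 400    # placeholder ADC value for the "dangerous" level

def monitor():
    """Print a warning whenever the MQ-135 reading exceeds the threshold."""
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            if not line:
                continue  # timeout or empty line
            try:
                reading = int(line)
            except ValueError:
                continue  # skip any non-numeric debug output
            if reading > GAS_THRESHOLD:
                print(f"WARNING: gas level {reading} above safe threshold")

if __name__ == "__main__":
    monitor()
```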