Project Management

Schedules

Gantt Chart

Gantt charts for the spring and fall semesters showing the schedule for each work package

Presenters

Sensors and Motors Control Lab: Dan Bronstein

Progress Review 1: Ben Kolligs

Progress Review 2: Kelvin Kang

Progress Review 3: Jacqueline Liao

Progress Review 4: Dan Bronstein


Test Plans

Spring Validation

Graphical depiction of the spring validation setup. Each number corresponds to the same numbered step under “Demonstrations” below.

Goal:

Showcase our complete hardware along with the basic functionality of our control, situational awareness, and human detection subsystems in a well-lit, smoke-free room.

Location:

Beeler Street, Pittsburgh, PA

Environment:

A well-lit, smoke-free room no smaller than 10 m².

Equipment:

Dragoon, visualizer, WiFi network, indoor room, room geometry/obstacle proxies (cardboard boxes, furniture, trash bins, etc.), human subject, external light source

Demonstrations:

  1. Power and Startup
    • Procedure: The operator turns on the robot and the visualizer, then places the robot at the entrance to the room
    • Validation:
      • The robot connects to the remote control
      • The Jetson AGX turns on and connects to the RealSense and Seek cameras
      • The RealSense depth and RGB streams and the Seek Thermal stream are operational and stream to the AGX
      • The VLP-16 starts scanning, and Cartographer begins mapping using the VLP-16 and the RealSense IMU
      • The visualizer turns on and begins displaying the 2D map and RGB stream at 10 Hz (M.P.4)
  2. Locomotion
    • Procedure: The operator uses the remote control to move the robot forwards and backwards at a maximum speed of 0.5 m/s and to turn it about its location (see the velocity-limiter sketch after this list)
    • Validation:
      • The robot moves forwards and backwards at a minimum of 0.5 m/s and turns in place at 18 degrees per second (M.P.5)
  3. SLAM
    • Procedure: The operator moves the robot through the obstacles in the room at a maximum speed of 0.2 m/s for at least 5 minutes, using the 2D map and RGB stream displayed on the visualizer
    • Validation:
      • A 2D map of the room up to 10 m from the robot is displayed and updated on the visualizer; the shapes of major obstacles are captured in the map (M.P.3)
      • The robot's position is displayed on the 2D map on the visualizer
  4. Basic Human Detection
    • Procedure: The robot detects and localizes a human entering the room
    • Validation:
      • The robot detects and localizes a human 8 m away within 5 seconds of their complete, unobstructed entrance into the RGB and Seek FOVs, with 75% accuracy (M.P.0, M.P.1)
      • The human's location is displayed on the 2D map on the visualizer, and the centroid of the detection bounding box is within 1 foot of the true position relative to the mapped room geometry (see the scoring sketch after this list)
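
A minimal sketch of how the Demonstration 2 speed caps could be enforced in software, assuming a ROS 1 stack where teleop commands arrive as geometry_msgs/Twist messages. The topic names (cmd_vel_raw, cmd_vel_safe) and the application of the 18 deg/s figure as an angular cap are illustrative assumptions, not Dragoon's actual wiring.

```python
#!/usr/bin/env python
# Hypothetical velocity limiter for Demonstration 2 (M.P.5):
# clamps incoming teleop commands to the test plan's speed caps.
import math

import rospy
from geometry_msgs.msg import Twist

MAX_LINEAR = 0.5                  # m/s, from the test plan
MAX_ANGULAR = math.radians(18.0)  # rad/s, assumed from the turn-rate spec

def clamp(value, limit):
    return max(-limit, min(limit, value))

def on_cmd(msg):
    safe = Twist()
    safe.linear.x = clamp(msg.linear.x, MAX_LINEAR)
    safe.angular.z = clamp(msg.angular.z, MAX_ANGULAR)
    pub.publish(safe)

rospy.init_node("velocity_limiter")
pub = rospy.Publisher("cmd_vel_safe", Twist, queue_size=1)
rospy.Subscriber("cmd_vel_raw", Twist, on_cmd)
rospy.spin()
```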
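
The Demonstration 4 criteria (5 s latency, 75% accuracy, 1 ft centroid error) lend themselves to offline scoring over logged trials. The sketch below is a hypothetical scorer, assuming each trial records when the human fully entered the FOV, when (if ever) a detection fired, and the detected versus surveyed human position in the map frame.

```python
# Hypothetical offline scorer for the Demonstration 4 criteria
# (M.P.0, M.P.1). Positions are (x, y) in the map frame, in metres.
import math

MAX_LATENCY_S = 5.0          # detection must fire within 5 s of FOV entry
MAX_CENTROID_ERR_M = 0.3048  # 1 foot, per the bounding-box criterion
MIN_ACCURACY = 0.75          # 75% of trials must pass

def trial_passes(t_enter_fov, t_detect, detected_xy, truth_xy):
    if t_detect is None or t_detect - t_enter_fov > MAX_LATENCY_S:
        return False  # missed or late detection
    err = math.hypot(detected_xy[0] - truth_xy[0],
                     detected_xy[1] - truth_xy[1])
    return err <= MAX_CENTROID_ERR_M

def accuracy(trials):
    return sum(trial_passes(*t) for t in trials) / len(trials)

# Illustrative trials: one pass (2.1 s latency, ~0.14 m error), one miss.
trials = [
    (0.0, 2.1, (4.9, 1.0), (5.0, 1.1)),
    (0.0, None, None, None),
]
print("PASS" if accuracy(trials) >= MIN_ACCURACY else "FAIL")
```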


Fall Validation

Goal:

Demonstrate that a user can teleoperate Dragoon, without line of sight, into a dimly lit and smoke-obfuscated room and determine the presence and location of a human, displayed in real time on a 2D map. The demonstration will also test the system's E-stop and its recording of mission and data logs.

Location:

Beeler Street, Pittsburgh, PA

Environment:

A dimly lit room no smaller than 10 m², with smoke obfuscation produced by a smoke machine.

Equipment:

Dragoon, visualizer, WiFi network, indoor room, room geometry/obstacle proxies (cardboard boxes, furniture, trash bins, etc.), human subject, smoke machine

Demonstrations:

  1. Power and Startup
    • Procedure: The operator turns on the robot and the visualizer, then places the robot at the entrance to the room
    • Validation:
      • The watchdog module display on the visualizer indicates the health of the batteries and sensors (M.N.5)
  2. Locomotion
    • Procedure: The operator moves the robot for a few seconds before hitting the E-stop
    • Validation:
      • The robot immediately stops all motion when the E-stop is hit (M.N.4; see the E-stop gate sketch after this list)
  3. SLAM
    • Procedure: The operator moves the robot through the obstacles in the room for at least 5 minutes, using the 2D map and RGB stream displayed on the visualizer
    • Validation:
      • The RGB stream and a 2D map of the dim, smoky room up to 10 m from the robot are displayed and updated on the visualizer; the shapes of major obstacles are captured in the map (M.P.2, M.P.3, M.N.0)
      • The 2D map is displayed and updated in real time (M.P.3)
  4. Human Detection
    • Procedure: The robot detects and localizes a human in the room
    • Validation:
      • The robot detects and localizes a human who is at least 30% occluded and 3 m away (M.P.1, M.P.2, M.N.0)
      • The human is detected in real time
      • The human's location is displayed on the 2D map on the visualizer in real time
  5. Mission Logging
    • Procedure: At the end of the demonstration, the user downloads the available data logs (see the log-coverage sketch after this list)
    • Validation:
      • All available data logs for the last 20 minutes are downloaded and accurate (M.P.7)
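
Demonstration 2's requirement that the robot "immediately stops all motion" (M.N.4) implies the E-stop must override any command already in flight. One common pattern is a gating node that latches the E-stop signal and substitutes zero velocity; the sketch below assumes ROS 1 and hypothetical topic names (estop, cmd_vel_teleop, cmd_vel), not Dragoon's actual interfaces.

```python
#!/usr/bin/env python
# Hypothetical E-stop gate for Demonstration 2 (M.N.4): once the
# E-stop signal latches True, every outgoing command is replaced
# with zero velocity.
import rospy
from std_msgs.msg import Bool
from geometry_msgs.msg import Twist

estopped = False

def on_estop(msg):
    global estopped
    estopped = estopped or msg.data  # latch: stays tripped until restart

def on_cmd(msg):
    pub.publish(Twist() if estopped else msg)  # Twist() is all zeros

rospy.init_node("estop_gate")
pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
rospy.Subscriber("estop", Bool, on_estop)
rospy.Subscriber("cmd_vel_teleop", Twist, on_cmd)
rospy.spin()
```

A hardened version would also publish zero commands at a fixed rate while latched, so the base stops even if the teleop stream goes silent.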
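
For Demonstration 5 (M.P.7), one way to check the 20-minute requirement is to verify that the downloaded logs cover that window without large gaps. The record format and the 5 s gap tolerance below are assumptions for illustration, not the team's actual log schema.

```python
# Hypothetical check that downloaded mission logs cover the final
# 20 minutes of a run (M.P.7). Records are (unix_time, message) pairs;
# the 5 s gap tolerance is an assumed sampling allowance.
WINDOW_S = 20 * 60

def logs_cover_window(records, t_end, max_gap_s=5.0):
    times = sorted(t for t, _ in records if t >= t_end - WINDOW_S)
    if not times or times[0] > t_end - WINDOW_S + max_gap_s:
        return False  # window start is missing
    for prev, curr in zip(times, times[1:]):
        if curr - prev > max_gap_s:
            return False  # a gap inside the window
    return t_end - times[-1] <= max_gap_s  # window end is covered

# Illustrative run: one log entry every 2 s for an hour.
records = [(t, "ok") for t in range(0, 3600, 2)]
print(logs_cover_window(records, t_end=3600))  # -> True
```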

Fall Semester Goals

  1. September
    • Milestones:
      • Defog RealSense and LiDAR
      • Improve performance of YOLO on Seek
    • Demonstrations:
      • The RealSense can detect humans under smoke obfuscation
      • LiDAR performance is not substantially degraded in smoke
      • Improved accuracy of Seek YOLO human detection
  2. October
    • Milestones:
      • Tune SLAM parameters
      • Integration of RGB and thermal human detection
      • Implement control schemes (Desired)
    • Demonstrations:
      • Improved mapping and localization of SLAM system
      • Fused human detections from RGB and thermal sensors (see the fusion sketch after this list)
      • Robot platform can be controlled without remote control (Desired)
  3. November
    • Milestones:
      • Full system integration
    • Demonstrations:
      • All subsystems integrated onto platform and function as expected
  4. December
    • Milestones:
      • FVD
    • Demonstrations:
      • Dragoon rises!
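
For the October milestone of integrating RGB and thermal human detection, a simple late-fusion scheme is to associate bounding boxes from the two detectors by intersection-over-union (IoU). The sketch below assumes thermal boxes have already been projected into the RGB image frame; the "both modalities must agree" policy and the 0.5 IoU threshold are illustrative choices, not the team's confirmed design.

```python
# Hypothetical late fusion of RGB and thermal human detections:
# keep an RGB box only if a thermal box overlaps it sufficiently.
def iou(a, b):
    """Intersection-over-union of boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter)

def fuse(rgb_dets, thermal_dets, iou_thresh=0.5):
    """Detections are (box, confidence); both modalities must agree."""
    fused = []
    for rgb_box, rgb_conf in rgb_dets:
        best = max(((iou(rgb_box, t_box), t_conf)
                    for t_box, t_conf in thermal_dets),
                   default=(0.0, 0.0))
        if best[0] >= iou_thresh:
            fused.append((rgb_box, max(rgb_conf, best[1])))
    return fused

rgb = [((100, 50, 180, 220), 0.8)]
thermal = [((95, 60, 175, 230), 0.7)]
print(fuse(rgb, thermal))  # overlapping box survives fusion
```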

Parts List

HOWDE Parts List (external link)


Issues Log

HOWDE Issues Log (external link)