Project Management

Schedule

Schedule for Spring Semester

Gantt Chart for Project Schedule

Date  Spring Milestone
1/20  Autoware environment setup
2/15  Website deployed
2/19  Simulation platform set up on CARLA-Autoware
3/4   Navigation subsystem design and simulation setup for docking ready
3/25  Navigation subsystem components tested
4/8   1. Navigation subsystem developed
      2. Docking subsystem components tested in simulation
      3. Safety subsystem components (object detection and behaviour prediction) developed
4/15  Health monitoring design ready; navigation and docking subsystem software integration done
4/22  SVD

Schedule for Fall Semester

Gantt Chart for Project Schedule

Date   Fall Milestone
9/1    Gap analysis; navigation subsystem
9/16   Docking subsystem designs completed
9/30   Vehicle models ready; simulation merged
10/14  Docking system completed and tested in simulation; sim2real test setup ready
10/28  Docking, navigation, and safety subsystems integrated in simulation
11/11  All subsystems finished and unit tested; sim2real test setup ready
11/18  FVD


Presentation Order for Progress Reviews:

For the spring and fall semesters: Rohan -> Sanil -> Poorva -> Uma -> Sachit


Test Plan

A detailed experiment plan for full-system validation can be found here

Spring Validation Demonstration

Note: Tests will be performed in simulation

Equipment: Laptop with Autoware and Gazebo, and a precomputed map of the Gazebo world.

Each validation experiment below gives a description, the test procedure, and the validation criteria.

    1. Navigation

The system will navigate from one end of the designated area in simulation to a point 2 m from the payload handling zone (PHZ) entry point, with no obstacles in the way.

Procedure:

    • Use Autoware to command the vehicle to the PHZ coordinates in the Gazebo world (a sketch of this step follows the criteria below).
    • Check the logs to confirm the PHZ was reached.
    • Use AprilTag detection and pod leg detection to locate the pod and refine the PHZ.

Validation criteria:

  1. A path to the destination point is planned and executed. (M.P.2)
  2. After reaching the 2 m proximity of the point, the vehicle identifies the payload handling zone entry point using localization data. (M.P.4)
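
As an illustration of the goal-issuing step, the sketch below publishes the PHZ goal as a geometry_msgs/PoseStamped. It assumes the common Autoware.AI convention of accepting 2D navigation goals on /move_base_simple/goal; the topic name and the coordinates are placeholder assumptions, not the project's verified configuration.

```python
#!/usr/bin/env python
# Hypothetical helper for the navigation test: publish the PHZ entry-point
# goal to the planner. The topic and coordinates below are assumptions.
import rospy
from geometry_msgs.msg import PoseStamped

def send_phz_goal(x, y):
    pub = rospy.Publisher('/move_base_simple/goal', PoseStamped,
                          queue_size=1, latch=True)
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = x
    goal.pose.position.y = y
    goal.pose.orientation.w = 1.0  # face along +x; refined near the pod
    pub.publish(goal)

if __name__ == '__main__':
    rospy.init_node('phz_goal_sender')
    send_phz_goal(12.0, 4.5)  # placeholder PHZ coordinates in the map frame
    rospy.spin()
```
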
    2. Safety

The system will stop at a distance of more than 30 cm from a detected obstacle (dynamic or static). The minimum dimensions of said obstacles will be 30 cm × 30 cm (height × width).

The system will also demonstrate its safety behaviors when a node crashes.

Procedure:

    • The vehicle will be asked to go from point A to point B in the simulated Gazebo world.
    • Obstacles will be placed in the path while the vehicle is traversing it.
    • A sensor node will be crashed manually and the response will be noted (a watchdog sketch follows the criteria below).

Validation criteria:

  1. Failures will be diagnosed within 2000 ms of occurrence and reported on the dashboard. (M.P.8)
  2. The vehicle will detect object locations with an accuracy of ±5 cm. (M.P.9)
  3. The vehicle will maintain a minimum distance of 30 cm from obstacles. (M.P.10)
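
A minimal sketch of how the 2000 ms diagnosis budget could be checked during the node-crash test, assuming each monitored sensor node publishes on a known topic. The topic names are placeholders, and the real system would report to the HMS dashboard rather than the ROS log.

```python
#!/usr/bin/env python
# Hypothetical liveness watchdog for the node-crash test: flag any monitored
# topic that has been silent for longer than the 2000 ms diagnosis budget.
import rospy
from sensor_msgs.msg import Image, PointCloud2

TIMEOUT = rospy.Duration(2.0)  # M.P.8: diagnose failures within 2000 ms

class Watchdog(object):
    def __init__(self, topics):
        self.last_seen = {}
        for topic, msg_type in topics:
            self.last_seen[topic] = rospy.Time.now()
            rospy.Subscriber(topic, msg_type, self._heartbeat,
                             callback_args=topic)

    def _heartbeat(self, _msg, topic):
        self.last_seen[topic] = rospy.Time.now()

    def check(self, _event):
        now = rospy.Time.now()
        for topic, stamp in self.last_seen.items():
            if now - stamp > TIMEOUT:
                # The real system would push this to the HMS dashboard.
                rospy.logerr('FAILURE: %s silent for more than 2000 ms', topic)

if __name__ == '__main__':
    rospy.init_node('sensor_watchdog')
    wd = Watchdog([('/points_raw', PointCloud2),  # placeholder topic names
                   ('/camera/image_raw', Image)])
    rospy.Timer(rospy.Duration(0.2), wd.check)    # poll at 5 Hz
    rospy.spin()
```
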
    3. Alignment and Docking

Once it has reached the payload handling zone, the system will plan and execute a path to align with it. The system will verify the accuracy of the docking and relay the result to the user. Done in Gazebo simulation.

Procedure:

    • Place the chassis in front of the pod and issue the docking command with the pod ID.
    • The chassis will identify the pod and obtain its relative pose.
    • The chassis will execute the docking command, align with the pod, and drive directly under it.

Validation criteria:

  1. The chassis will align with the pod within an error margin of ±5 cm in (x, y) and ±5.54 degrees in theta from the center (a margin-check sketch follows below). (M.P.6)
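
One way to score this criterion is to compare the achieved chassis pose against the pod-center target and check the ±5 cm / ±5.54° margins. A minimal sketch, assuming (x, y, yaw) poses in a common frame; in the test these would come from the AprilTag-based relative localization.

```python
import math

# Tolerances from M.P.6: +/-5 cm in (x, y) and +/-5.54 degrees in theta.
POS_TOL_M = 0.05
YAW_TOL_RAD = math.radians(5.54)

def docking_ok(chassis_pose, pod_center_pose):
    """Check alignment; poses are (x, y, yaw) tuples in a common frame."""
    dx = chassis_pose[0] - pod_center_pose[0]
    dy = chassis_pose[1] - pod_center_pose[1]
    # Wrap the yaw difference into [-pi, pi) before comparing.
    dyaw = (chassis_pose[2] - pod_center_pose[2] + math.pi) % (2 * math.pi) - math.pi
    return abs(dx) <= POS_TOL_M and abs(dy) <= POS_TOL_M and abs(dyaw) <= YAW_TOL_RAD

# Example: 3 cm off in x and 2 degrees off in yaw is within margin.
print(docking_ok((1.03, 0.0, math.radians(2.0)), (1.0, 0.0, 0.0)))  # True
```
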
    4. Retracing

We will manually introduce a slight misalignment between the pod and chassis and check whether the system detects it and re-performs the docking alignment.

Procedure:

    • Introduce a misalignment in the pod's orientation during the docking procedure.
    • The system should recognize the misalignment, retrace its path, and follow the corrected path to align correctly and dock (a retry-loop sketch follows the criteria below).

Validation criteria:

  1. Docking will be done within 120 s (partially fulfilled). (M.P.5)
  2. The chassis will align with the pod within an error margin of ±5 cm. (M.P.6)
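
The retracing behavior amounts to a bounded retry loop around the alignment check sketched above. In the sketch below, detect_relative_pose and execute_docking are hypothetical hooks into the real stack, the retry bound is an assumption, and the 120 s budget comes from M.P.5.

```python
import time

MAX_ATTEMPTS = 3        # assumed retry bound, not a stated requirement
TIME_BUDGET_S = 120.0   # M.P.5: docking completes within 120 s

def dock_with_retrace(detect_relative_pose, execute_docking, docking_ok):
    """Retry docking until the alignment check passes or budgets run out.

    detect_relative_pose() -> (x, y, yaw) of the pod relative to the chassis;
    execute_docking(pose) drives the chassis under the pod; docking_ok is the
    M.P.6 margin check sketched earlier, with (0, 0, 0) as the relative target.
    """
    start = time.time()
    for _attempt in range(MAX_ATTEMPTS):
        if time.time() - start > TIME_BUDGET_S:
            break
        execute_docking(detect_relative_pose())
        if docking_ok(detect_relative_pose(), (0.0, 0.0, 0.0)):
            return True
        # Misalignment detected: retrace and re-run the alignment.
    return False
```
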

Fall Validation Demonstration

Objective: To validate the subsystems and meet mandatory performance requirements

Location and Conditions: NSH Basement (B level) and Simulation World

Equipment: Sensor jig, a pod with fiducial markers, laptop with Autoware and dependencies, and a precomputed map of the testing environment

Procedure

Simulation

  1. Launch the simulation, place the chassis in the pre-mapped environment, and instruct it to go to a pod location to demonstrate point-to-point navigation.
  2. Place an obstacle in the planned path. Verify that the Health Monitoring System (HMS) transmits an error message to the system and the chassis stops as a result. Remove the obstacle and verify that the chassis continues on its planned path.
  3. After the chassis reaches the Payload Handling Zone (PHZ), verify that the PHZ and pod are identified.
  4. Verify that the docking command has been issued by the state machine.
  5. Issue a command to the chassis to transport the pod to the drop-off location and then undock from it.
  6. Sensor nodes will be crashed to observe system behavior.
  7. Repeat the Approach Navigation step multiple times. For one trial, introduce a misalignment after the chassis achieves the docking position to demonstrate retracing.
  8. Monitor the system states and logs during the whole process.

Hardware

  1. Test 1: Set up the hardware jig and have a pedestrian walk in front of it. Demonstrate pedestrian detection and trajectory prediction, and compare the results to ground-truth positions.
  2. Test 2: Place the jig in front of the pod and demonstrate relative localization by localizing the pod and comparing against ground-truth values. Demonstrate the docking verification check by placing the jig under the pod and comparing the predicted docking pose error with the actual (ground-truth) pose error. A sketch of the error computation used in both tests follows below.
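
A sketch of the ground-truth comparison used in both hardware tests, with the Test 2 margins from the validation criteria below (20 cm in position, ±11.80 degrees in yaw). The sample poses are placeholders for logged estimates and measured ground truth.

```python
import math

def pose_errors(estimates, ground_truth):
    """Per-sample position (m) and yaw (deg) errors for (x, y, yaw) tuples."""
    errors = []
    for (ex, ey, eyaw), (gx, gy, gyaw) in zip(estimates, ground_truth):
        pos_err = math.hypot(ex - gx, ey - gy)
        # Wrap the yaw difference into [-pi, pi) before taking its magnitude.
        yaw_err = abs((eyaw - gyaw + math.pi) % (2 * math.pi) - math.pi)
        errors.append((pos_err, math.degrees(yaw_err)))
    return errors

# Placeholder data: one estimate 11 cm and 3 degrees off the ground truth.
est = [(1.05, 0.10, math.radians(3.0))]
gt = [(1.00, 0.00, 0.0)]
for pos_err, yaw_err in pose_errors(est, gt):
    print('pass' if pos_err <= 0.20 and yaw_err <= 11.80 else 'fail')  # pass
```
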

Validation Criteria

  1. Simulation: The chassis will plan a path to the desired pod, identify it, and dock with it. It will then drop the pod off at a predefined undocking location. State transitions happen only when the last action has been completed. The logs will show the system health, states, and results in an easily readable format. The metrics that will be met are:
    1. Docking will be done within 120 s. (M.P.2)
    2. The simulation should be successful 85% of the time (a tally sketch follows below). The success criterion subsumes M.P.1 (the chassis will identify the pod when it is within 2 m of the PHZ), M.P.3 (the chassis will achieve the docking pose with an error margin of ±5 cm in position and ±5.54 degrees in orientation), and M.P.8 (the chassis will identify the PHZ 90% of the time).
  2. Hardware: Validate algorithms with real-world data and show that the same accuracy criteria are met. The metrics that will be met are:
    1. Test 1: Object locations are detected with an accuracy of ±5 cm. (M.P.6)
    2. Test 2: The predicted pod position (x, y) from relative localization is within 20 cm of the ground truth, and the orientation (yaw) is within ±11.80 degrees of the ground truth. The same margins hold between the predicted docking pose error (x, y, yaw) and the actual error.
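
The 85% criterion is scored over repeated simulation runs; a minimal tally sketch, assuming each run's pass/fail outcome is logged:

```python
def meets_success_criterion(trial_results, threshold=0.85):
    """trial_results: one boolean per simulation run (True = success)."""
    return sum(trial_results) / float(len(trial_results)) >= threshold

# Example: 18 of 20 runs succeed -> 90% >= 85% -> criterion met.
print(meets_success_criterion([True] * 18 + [False] * 2))  # True
```
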

Updated Risks

Each risk lists the affected requirements, risk type, likelihood (L, 1–5), consequence (C, 1–5), and mitigations.

R1. Issues with legacy system
    Requirements: All MPs | Type: Technical, Schedule | L: 5 | C: 5
    Mitigation:
    – Develop plan to test chassis
    – Continue developing and testing algorithms in simulation

R2. Unable to assemble chassis
    Requirements: All MPs | Type: Technical, Cost, Schedule, Programmatic | L: 5 | C: 5
    Mitigation:
    – Develop plan to assemble chassis
    – Test subsystems in simulation
    – Move FVD to simulation

R3. Sensors stop working
    Requirements: MPs 2, 3, 4, 7, 8, 9 | Type: Technical, Cost | L: 2 | C: 3
    Mitigation:
    – Formulate sensor test plans and safeguarding procedures
    – Use a hardware abstraction layer

R4. Testing infeasible due to weather conditions
    Requirements: All MPs | Type: Technical, Schedule | L: 3 | C: 3
    Mitigation:
    – Test subsystems in NSH B-level

R5. Communication gap with sponsor
    Requirements: All MPs | Type: Schedule | L: 4 | C: 4
    Mitigation:
    – Document work
    – Provide regular updates

R6. Issues in subsystem integration
    Requirements: MPs 5, 6, 8 | Type: Technical, Schedule | L: 4 | C: 4
    Mitigation:
    – Perform testing on individual subsystems
    – Develop an integration test plan early

R7. Overambitious requirements
    Requirements: MPs 2, 3, 4 | Type: Technical, Schedule | L: 3 | C: 3
    Mitigation:
    – Refine requirements and descope the system if necessary

R8. Integration issues with code
    Requirements: All MPs | Type: Technical, Schedule | L: 2 | C: 3
    Mitigation:
    – Document work with changes implemented
    – Perform unit testing at each level possible
    – Use a common framework to develop code

R9. Global pandemic occurs, causing campus shutdown
    Requirements: All MPs | Type: Programmatic | L: 1 | C: 3
    Mitigation:
    – Continue working in simulation
    – Follow government and university guidelines regarding health and working conditions
    – Develop schedules and have regular follow-ups with team members

R10. Algorithms do not pan out in real life
    Requirements: All MPs | Type: Technical, Schedule | L: 3 | C: 4
    Mitigation:
    – Test subsystems individually to assess whether the problem lies in the software or in the sim2real transfer

Parts List

Can be found here


Issues Log

Can be found here