1. Fall Semester
For the Fall semester, the plan was to finish the hardware part of the system, synchronize the data from the stereo camera and radar, and collect data from the sensors. The test conditions, including the location, equipment, and environment required, are shown in Table 1. The complete steps and performance requirements of the Fall Validation Experiment (FVE) are specified in Table 2.
Table 1. Fall testing conditions
Location: The streets around the CMU campus
Equipment: Test vehicle, Thule roof rack, two PointGrey Grasshopper3 cameras, one Delphi ESR 2.5 radar, camera housings, radar mount
Table 2. Test Plan Details
Experiment A: Verifying weatherproofing of camera enclosures and radar mount

| Step | Step description | Success condition |
| --- | --- | --- |
| A.1 | Splash water and blow dust towards the enclosures from different directions. | Both camera enclosures pass the IP54 rating. No dust or water enters the enclosures. |
Experiment B: Verifying robustness of sensor rack and mounting method

| Step | Step description | Success condition |
| --- | --- | --- |
| B.1 | Fix the rack and sensors on the test vehicle, then measure the relative positions of the sensors. | The sensors mount securely onto the car while the car is stationary. |
| B.2 | Drive around the school for about 20 minutes, including quick turns and stops as well as driving over cobbled roadways. | Sensors do not fall off the car. |
| B.3 | Measure the relative positions of the sensors after the driving test in step B.2. | The relative positions of the sensors change by less than 5 mm in any direction. |
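The pass/fail check in step B.3 is easy to automate once the before/after measurements are logged. A minimal sketch (the sensor names and position values below are hypothetical, not measured data; NumPy is assumed available):

```python
import numpy as np

# Measured sensor positions (mm) relative to a rack reference point,
# before (B.1) and after (B.3) the driving test. Values are hypothetical.
before = {"left_cam": np.array([0.0, 250.0, 0.0]),
          "right_cam": np.array([0.0, -250.0, 0.0]),
          "radar": np.array([50.0, 0.0, -120.0])}
after = {"left_cam": np.array([0.5, 250.3, -0.2]),
         "right_cam": np.array([-0.4, -249.8, 0.1]),
         "radar": np.array([49.2, 0.6, -119.5])}

TOLERANCE_MM = 5.0  # success condition from step B.3

def check_drift(before, after, tol=TOLERANCE_MM):
    """True if every sensor moved less than `tol` mm along every axis."""
    return all(np.all(np.abs(after[name] - before[name]) < tol)
               for name in before)

print(check_drift(before, after))  # True -> step B.3 passes
```

Checking each axis separately (rather than the Euclidean distance) matches the "in any direction" wording of the success condition.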
Experiment C: Verifying stereo vision performance in adverse weather conditions

| Step | Step description | Success condition |
| --- | --- | --- |
| C.1 | Drive around for 20 minutes in fog. | The recorded data are not obscured, and object detection and tracking remain possible up to at least 15 m. The stereo vision output can be visualized and shown. |
| C.2 | Drive around for 20 minutes in rain. | Same success condition as C.1. |
| C.3 | Drive around for 20 minutes in snow. | Same success condition as C.1. |
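Whether the 15 m range target in Experiment C is realistic for the stereo rig can be sanity-checked from the disparity equation Z = f·B/d. A minimal sketch (the focal length and baseline below are assumed placeholder values, not the calibrated parameters of the Grasshopper3 rig):

```python
# Depth from stereo disparity: Z = f * B / d.
FOCAL_PX = 1400.0   # focal length in pixels (assumed, not calibrated)
BASELINE_M = 0.5    # camera baseline in meters (assumed)

def depth_from_disparity(disparity_px):
    """Depth in meters for a given disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_PX * BASELINE_M / disparity_px

def min_disparity_for_range(range_m):
    """Smallest disparity (px) an object at `range_m` would produce."""
    return FOCAL_PX * BASELINE_M / range_m

# With these assumed parameters, an object at the 15 m requirement
# still produces tens of pixels of disparity, well above matching noise.
print(round(min_disparity_for_range(15.0), 1))
```

If the disparity at the required range fell near the matcher's noise floor (a pixel or two), the 15 m success condition would need a wider baseline or longer focal length.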
Experiment D: Verifying sensor synchronization

| Step | Step description | Success condition |
| --- | --- | --- |
| D.1 | Point the cameras at a stopwatch and trigger them to see whether they capture images at the same time. | The captured images from both cameras show the same time on the stopwatch. |
| D.2 | Mount the sensors onto the test vehicle and drive around for 20 minutes. Monitor the different sets of data collected by the stereo camera and radar and compare their timelines for consistency. | Data from the stereo camera and radar are synchronized and can be displayed in real time (less than 100 ms delay). |
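The 100 ms criterion in D.2 can be verified offline from logged timestamps by pairing each camera frame with the nearest radar sample. A minimal sketch (the stream rates and timestamp values are illustrative, not measured from the actual sensors):

```python
from bisect import bisect_left

# Hypothetical capture timestamps in seconds for a 20 Hz radar stream
# and a 30 Hz stereo stream.
radar_ts = [i * 0.05 for i in range(10)]               # 20 Hz
camera_ts = [0.002 + i * (1 / 30) for i in range(7)]   # 30 Hz

MAX_SKEW_S = 0.100  # success condition from step D.2: < 100 ms

def nearest(ts_list, t):
    """Timestamp in sorted `ts_list` closest to t."""
    i = bisect_left(ts_list, t)
    candidates = ts_list[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda x: abs(x - t))

def synchronized(cam_ts, rad_ts, max_skew=MAX_SKEW_S):
    """True if every camera frame has a radar sample within max_skew."""
    return all(abs(t - nearest(rad_ts, t)) < max_skew for t in cam_ts)

print(synchronized(camera_ts, radar_ts))  # True for these streams
```

This assumes both streams are stamped against the same clock; if the radar and cameras use independent clocks, the offset between them has to be estimated first.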
Experiment E: Test the object detection algorithm's accuracy and speed

| Step | Step description | Success condition |
| --- | --- | --- |
| E.1 | Use the images captured by the stereo cameras in the previous tests as input, and produce detection and classification results. | The algorithm shall detect and classify pedestrians and vehicles with at least 70% accuracy. The algorithm shall give the size of the objects with accuracy up to x%. The algorithm shall give the results within 200 ms. |
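The accuracy and latency targets in E.1 can be scored with a simple harness over labeled frames. A minimal sketch (`detect` is a hypothetical placeholder for the real detector, and the frame labels are made up; only the 70% and 200 ms thresholds come from the test plan):

```python
import time

ACCURACY_TARGET = 0.70   # success condition: >= 70% accuracy
LATENCY_TARGET_S = 0.200  # success condition: results within 200 ms

def detect(frame):
    """Stand-in detector: returns a predicted class label.
    Placeholder so the sketch runs; the real detector goes here."""
    return frame["truth"]

def evaluate(frames):
    """True if classification accuracy and worst-case latency both pass."""
    correct = 0
    worst_latency = 0.0
    for frame in frames:
        start = time.perf_counter()
        predicted = detect(frame)
        worst_latency = max(worst_latency, time.perf_counter() - start)
        correct += predicted == frame["truth"]
    accuracy = correct / len(frames)
    return accuracy >= ACCURACY_TARGET and worst_latency < LATENCY_TARGET_S

frames = [{"truth": "pedestrian"}, {"truth": "vehicle"}, {"truth": "vehicle"}]
print(evaluate(frames))  # True with the placeholder detector
```

Using the worst-case latency rather than the mean is the stricter reading of the "within 200 ms" condition.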
2. Spring Semester
For the Spring semester, we planned mainly to verify our functional requirements on the software components. The logistics are shown below, along with the initial version of the schedule in Table 3. More details on the updated schedule are given in the Test Plan in PDF format on the Document page.
Test Locations
- MRSD Lab
- Outdoor parking lot
- Streets around school
Test Personnel
- Amit Agarwal
- Harry Golash
- Yihao Qian
- Menghan Zhang
- Zihao (Theo) Zhang
Test Materials
- Testing Vehicle
- Grasshopper3 Cameras
- Delphi ESR 2.5 Radar
- Step-Up power distribution PCB
- Sensor mounts and fixtures
- Electromagnets and chargers
- UINSTONE 150W Power Inverter
- Laptop
- GPS
Table 3. Test Plan Schedule
| Date (PR) | Capability Milestone | Associated Test | Associated System Requirements |
| --- | --- | --- | --- |
| Feb 15 (PR 8) | Radar data filtering. Calibrated stereo vision. | Test A | • Use multiple sensors |
| Mar 1 (PR 9) | Object detection and tracking. Ego-motion estimation. | Test B, Test C | • Detect and identify objects (pedestrians and vehicles) • Classify objects (pedestrians and vehicles) • Estimate external vehicle motion and ego-motion |
| Mar 22 (PR 10) | Filtered data from all sensors | Test A | • Use multiple sensors |
| Apr 5 (PR 11) | Sensor fusion | Test D | • Use multiple sensors |
| Apr 17 (PR 12) | Integrated systems | Test E | • Conduct full-range perception • Perceive in real time |