Project Management

Schedule

Scrum Timeline:

Over the course of the coming year, we will have a total of 11 two-week development sprints.

Detailed planning is broken into two major releases, centered around the two demonstrations in December 2015 and April 2016. Each sprint fits into the overall schedule in the following manner.

[Figure: sprint schedule]

Project progress visualization:

The burndown chart below shows our trajectory with respect to our first release plan, due for completion in December 2015. The plan (blue line) represents an estimated level of focused, productive effort of 10 hours per week per team member.

This first release includes all planned work up until the first Fall demo (see test plan details below).

[Figure: MRSD burndown chart, updated November 23]

NOTE: The burndown chart is shown only for the Fall semester; the Spring semester focused on implementation.

Burndown chart annotations:

  • During the week of 10/23 remaining work spiked due to the inclusion of several missing work items related to the MRSD Project course deliverables.
  • Work remaining dropped significantly during the week of 11/21 due to the postponement of work related to autonomous control of the Iris+ quadcopter.
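The plan line in the chart is simple arithmetic; a minimal sketch of how it can be generated, assuming the four-person team listed in the presenter schedule and the stated 10 focused hours per member per week (the 400-hour total in the example is illustrative, not our actual backlog size):

```python
def ideal_burndown(total_hours, weeks, members=4, hours_per_week=10):
    # Planned hours remaining at the start of each week, assuming every
    # member contributes hours_per_week focused hours.
    weekly_burn = members * hours_per_week  # 40 planned hours per week
    return [max(total_hours - weekly_burn * w, 0) for w in range(weeks + 1)]
```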

 

Work Breakdown Structure Summary as of 4/1/2016:

[Figure: Work breakdown structure, April 2016]

[Figure: WBS status summary]

 

Presenter Schedule:

Fall:

October 22nd – Progress Review 1 – Presenter: Erik Sjoberg

October 29th – Progress Review 2 – Presenter: Cole Gulino

November 12th – Progress Review 3 – Presenter: Rohan Thakker

November 24th – Progress Review 4 – Presenter: Job Bedford

Spring:

January 27th – Progress Review 7 – Presenter: Rohan Thakker

February 8th – Progress Review 8 – Presenter: Erik Sjoberg

February 24th – Progress Review 9 – Presenter: Cole Gulino

March 16th – Progress Review 10 – Presenter: Job Bedford

March 30th – Progress Review 11 – Presenter: Rohan Thakker

April 11th – Progress Review 12 – Presenter: Erik Sjoberg

 

Test Plan


  1. Backup Iris+ Drone Hardware
  • Objective: The purpose of this test is to demonstrate a second complete, working Iris+ drone with all sensors and hardware integrated.
  • Elements: This test will feature the following elements:
    1. Hardware Subsystem – The complete and working hardware of the Iris+ drone
    2. ROS Framework – The high-speed integration of our Linux SBC (ROS) and the flight-control firmware running on the Pixhawk microcontroller
  • Location: MRSD Lab
  • Equipment:
    1. Iris+ Drone
    2. Laptop PC
  • Personnel:
    1. Erik Sjoberg – Erik will demonstrate the functioning powered-up hardware
  • Procedure:
    1. Power up drone from battery and demonstrate blinking lights on SBC and PX4 Flow camera
    2. Connect to the drone from the PC via the Wi-Fi network
    3. Connect to ROS master running on Iris+ SBC
    4. Display high-rate IMU and height sensor readings from Iris+ on PC
  • Verification Criteria:
    1. Iris+ SBC and PX4Flow sensor power up successfully
    2. Able to connect to Iris+ SBC over wireless from PC
    3. Able to see IMU data at over 150 Hz via rostopic hz
    4. Able to see appropriate depth readings from sonar sensor via rostopic echo
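Verification criterion 3 can also be checked from recorded data rather than by watching rostopic hz output by eye; the sketch below reimplements the same average-rate statistic in plain Python (no ROS dependency), over a list of message timestamps in seconds:

```python
def estimate_rate_hz(stamps):
    # Average publish rate over a window of message timestamps (seconds),
    # the same statistic that rostopic hz reports.
    if len(stamps) < 2:
        raise ValueError("need at least two timestamps")
    return (len(stamps) - 1) / (stamps[-1] - stamps[0])

def imu_rate_ok(stamps, required_hz=150.0):
    # Pass criterion 3 when the observed rate exceeds the requirement.
    return estimate_rate_hz(stamps) > required_hz
```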
  2. AR.Drone Functionality Complete
  • Objective: Showcase ability to command drone and detect April Tag
  • Elements: This test will feature the following elements:
    1. Hardware Subsystem – AR.Drone
    2. ROS Framework – Communication between the ROS driver running on the laptop and the AR.Drone's onboard flight firmware
  • Location: NSH B-level
  • Equipment:
    1. AR.Drone
    2. Laptop PC
    3. April Tag
  • Personnel:
    1. Job Bedford – Job will demonstrate the AR.Drone functionality
  • Procedure:
    1. Initiate drone setup: launch packages and drivers, establish communication, and place the drone in its starting position
    2. Take off with the drone via the mover node
    3. Showcase planar movement by sending commands to the drone
    4. Showcase April Tag detection with pose and orientation estimates
  • Verification Criteria:
    1. Drone should take off with the mover node command
    2. Drone should obey all flight commands
    3. The ROS node should provide tag location estimates within a 1 m error margin
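The 1 m error margin in criterion 3 is easy to evaluate offline against a surveyed tag position; a minimal sketch (the function and variable names are ours, not from the codebase):

```python
import math

def tag_position_error(estimated, ground_truth):
    # Euclidean distance between the estimated and surveyed (x, y, z)
    # tag positions, in metres.
    return math.dist(estimated, ground_truth)

def tag_within_margin(estimated, ground_truth, margin_m=1.0):
    # True when the estimate falls inside the allowed error margin.
    return tag_position_error(estimated, ground_truth) <= margin_m
```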
  3. Autonomous Hovering with Iris+
  • Objective: The purpose of this test is to demonstrate the ability to autonomously hover with the IRIS+. This will help us in implementing the cone search and docking states.
  • Elements: This test will feature the following elements:
    1. Local Planning subsystem – to test the implementation of the “Hover In Plane” unit of the subsystem
    2. World Modelling subsystem – to test the implementation of the “Pose Estimation” unit of the subsystem
  • Location: NSH B-Level Basement
  • Equipment:
    1. Iris+ Drone
    2. Laptop
    3. RC remote of IRIS+ Drone
  • Personnel:
    1. Rohan Thakker – Rohan will be the operator who is required to run the commands for the drone.
    2. Erik Sjoberg- Manually control the drone using the RC remote
  • Procedure:
    1. Cordon off the area in the B-Level basement for safety.
    2. Under manual control, take off with the drone and move into desired initial position and orientation.
    3. Shift to autonomous control and run the command to move the Iris+ along a square trajectory.
    4. Observe as the Iris+ autonomously navigates along the predefined trajectory.
    5. Shift to manual control and land the robot
  • Verification Criteria:
    1. Iris+ was able to autonomously navigate the reference trajectory with less than ±2 m of tracking error.
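The square trajectory and the ±2 m tracking-error criterion can both be expressed compactly; a sketch under our own naming (the real waypoint format depends on the planner in use):

```python
import math

def square_waypoints(side_m, points_per_edge=5):
    # (x, y) waypoints tracing a square of the given side length,
    # starting and ending at the origin.
    corners = [(0.0, 0.0), (side_m, 0.0), (side_m, side_m), (0.0, side_m), (0.0, 0.0)]
    wps = []
    for (x0, y0), (x1, y1) in zip(corners, corners[1:]):
        for i in range(points_per_edge):
            t = i / points_per_edge
            wps.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    wps.append(corners[-1])
    return wps

def max_tracking_error(flown, reference):
    # Worst-case distance from each flown point to the nearest reference
    # waypoint; the test passes when this stays under 2.0 m.
    return max(min(math.dist(p, r) for r in reference) for p in flown)
```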
  4. AR.Drone Autonomous Docking
  • Objective: Showcase the AR.Drone's ability to land and dock using April Tag detection.
  • Elements: This test will feature the following elements:
    1. Hardware Subsystem – AR.Drone
    2. April tags
  • Location: NSH B-level
  • Equipment:
    1. AR.Drone
    2. Laptop PC
    3. April Tag
  • Personnel:
    1. Job Bedford – Job will demo the AR.Drone autonomous docking
  • Procedure:
    1. Initiate drone setup: launch packages and drivers, establish communication, and place the drone
    2. Job will command the drone to take off and position it above the April Tag with an offset
    3. Job will initiate the docking sequence
    4. Drone will track and position itself above the tag
    5. Drone will descend and land within the designated marker
  • Verification Criteria:
    1. Drone should track and position itself above the tag
    2. Drone should descend and land in the area of the April Tag, with its landing gear inside the markers
    3. The marker will denote the level of accuracy in docking
  5. Autonomous Docking with Iris+
  • Objective: The purpose of this test is to demonstrate the ability to dock autonomously with the Iris+.
  • Elements: This test will feature the following elements:
    1. Local Planning subsystem – to test the implementation of the “Land” unit of the subsystem
    2. World Modelling subsystem – to test the implementation of the “Pose Estimation” unit of the subsystem using the APRIL Tag
    3. Tactical Planning – to test the implementation of “Attempt Docking” unit of the subsystem.
  • Location: NSH B-Level Basement
  • Equipment:
    1. Iris+ Drone
    2. Docking station with APRIL Tag
    3. Laptop
    4. RC remote of IRIS+ Drone
  • Personnel:
    1. Rohan Thakker – Rohan will be the operator who is required to run the commands for the drone.
    2. Erik Sjoberg – Manually control the drone using the RC remote
  • Procedure:
    1. Cordon off the area in the B-Level basement for safety.
    2. Place the docking station in the center of the area
    3. Under manual control, take off with the drone and move to a position above the docking station such that the APRIL Tag is in the field of view of the downward facing camera
    4. Shift to autonomous control and run the command to dock the drone
    5. Observe the IRIS+ attempt landing
    6. Repeat from step 3 three more times
  • Verification Criteria:
    1. Iris+ successfully lands in the center of the funnels on the docking station in more than 1 of the 4 attempts
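The "more than 1 out of 4" criterion, combined with a funnel-radius hit test, can be scored mechanically; a sketch in which the funnel radius is an assumed placeholder value, not a measured one:

```python
import math

FUNNEL_RADIUS_M = 0.15  # assumed funnel opening radius; ours may differ

def landing_in_funnel(touchdown_xy, funnel_center_xy, radius_m=FUNNEL_RADIUS_M):
    # A landing counts when the touchdown point is inside the funnel opening.
    return math.dist(touchdown_xy, funnel_center_xy) <= radius_m

def docking_test_passed(touchdowns, funnel_center_xy):
    # Pass when more than 1 of the recorded attempts lands inside the funnel.
    hits = sum(landing_in_funnel(p, funnel_center_xy) for p in touchdowns)
    return hits > 1
```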
  6. Integrated AR.Drone Demo
  • Objective: Demo the AR.Drone's ability to complete the SVE tasks by merging them into one continuous demo of takeoff, tornado search, wellhead alignment, dock alignment, and docking. This completes our risk mitigation for the drone backup plan.
  • Elements: This test will feature the following elements:
    1. Hardware Subsystem – AR.Drone
    2. April tags
  • Location: NSH B-level
  • Equipment:
    1. AR.Drone
    2. Laptop PC
    3. April Tag
    4. Wellhead mock up
    5. Dock marker
  • Personnel:
    1. Job Bedford – Job will demo the AR.Drone integrated demo
  • Procedure:
    1. Initiate drone setup: launch packages and drivers, establish communication, and place the drone
    2. Job will command the drone to take off and position it in the initial starting area
    3. Job will initiate the sequence
    4. Drone will perform the tornado search, staying within the 7-meter demo area, until it detects the wellhead marker
    5. Drone will identify the wellhead and align itself 1 meter in front of it
    6. Drone will then identify and align with the dock
    7. Drone will then descend and dock
  • Verification Criteria:
    1. Drone performs the tornado search within the 7-meter-square area
    2. Drone identifies and aligns itself 1 meter in front of the wellhead
    3. Drone descends and docks on the docking station
    4. Drone landing gear lands within the circular markers
  7. Cone Search with Iris+
  • Objective: The purpose of this test is to showcase the system integration for two of the major functional elements of the project: “Search for and Approach Wellhead” and “Align Self with Wellhead”. The integration of these two functional areas is highly critical for the success of the project. By showcasing the integration of the subsystems involved with these two functional elements, we will showcase that two-thirds of the major system functionality has been fully implemented and integrated.
  • Elements: This test will feature the following elements:
    1. Vision subsystem – In order to recognize and lock onto the dock position, we must showcase that the vision system is able to recognize tags and get position estimates from them.
    2. Control subsystem – In order to send waypoint information for the predefined cone-search path, we must have accurate position control in order to have waypoint following.
    3. Navigation system – The entire navigation system will be tested for integration effectiveness. The subsystems involved: vision, sensor fusion, position control, localization, etc.
    4. Autonomous position locking – The system must be able to lock around a position in order to hover over the dock in preparation for landing. This integration test covers the same subsystems of the navigation system.
  • Location: NSH B-Level Basement
  • Equipment:
    1. Iris+ Drone
    2. Wellhead infrastructure with identifying tag
  • Personnel:
    1. Cole Gulino – will be the operator who is required to run the commands for the drone.
    2. Erik Sjoberg – will be the backup operator for manual control of the Iris+ in case of emergency.
  • Procedure:
    1. Cordon off the area in the B-Level basement for safety.
    2. Under manual control, take off with the drone and move into desired initial position and orientation.
    3. Run the command to commence the cone-search.
    4. Observe as the Iris+ autonomously navigates the predefined cone-search trajectory until it has located the wellhead and locked on the position of the dock next to it.
    5. Verify that the Iris+ has position lock with the dock within the tolerance specified in the Verification Criteria.
  • Verification Criteria:
    1. Iris+ has completed its cone-search maneuver.
    2. Iris+ is hovering over the center of the tag on the wellhead with a tolerance of 0.5 m in any direction.
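The cone-search trajectory itself is not specified in this document; one common realization is an expanding spiral at fixed altitude, sketched below with parameters chosen purely for illustration:

```python
import math

def spiral_search_waypoints(max_radius_m, turns=3, points_per_turn=12, altitude_m=1.5):
    # Expanding (Archimedean) spiral, capped at max_radius_m so the drone
    # stays inside the cordoned search area.
    wps = []
    total = turns * points_per_turn
    for i in range(total + 1):
        theta = 2.0 * math.pi * i / points_per_turn
        r = max_radius_m * i / total
        wps.append((r * math.cos(theta), r * math.sin(theta), altitude_m))
    return wps
```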
  8. Integrate the Subsystems
  • Objective: By showcasing the integration of the subsystems involved with these three functional elements, we will showcase that the integration for all three functional areas has been completed in the simplest case.
  • Elements: This test will feature the following elements:
    1. Vision subsystem – In order to recognize and lock onto the dock position, we must showcase that the vision system is able to recognize tags and get position estimates from them.
    2. Control subsystem – In order to send waypoint information for the predefined cone-search path, we must have accurate position control in order to have waypoint following.
    3. Navigation system – The entire navigation system will be tested for integration effectiveness. The subsystems involved: vision, sensor fusion, position control, localization, etc.
    4. Autonomous position locking – The system must be able to lock around a position in order to hover over the dock in preparation for landing. This integration test covers the same subsystems of the navigation system.
  • Location: NSH B-Level Basement
  • Equipment:
    1. Iris+ Drone
    2. Wellhead infrastructure with identifying tag
    3. Dock infrastructure with identifying tag
  • Personnel:
    1. Cole Gulino – will be the operator who is required to run the commands for the drone.
    2. Erik Sjoberg –  will be the backup operator for manual control of the Iris+ in case of emergency.
  • Procedure:
    1. Cordon off the area in the B-Level basement for safety.
    2. Under manual control, take off with the drone and move into desired initial position and orientation.
    3. Run the command to commence the cone-search.
    4. Observe as the Iris+ autonomously navigates the predefined cone-search trajectory until it has located the wellhead and locked on the position of the dock next to it.
    5. Verify that the Iris+ has position lock with the dock within the tolerance specified in the Verification Criteria.
    6. Verify that the Iris+ has successfully landed on the docking infrastructure.
  • Verification Criteria:
    1. Iris+ has completed its cone-search maneuver.
    2. Iris+ is hovering over the center of the tag on the wellhead with a tolerance of 0.5 m in any direction.
    3. Iris+ has docked, constrained in 5 DOF.
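"Docked in 5 DOF" means the funnel geometry constrains position and tilt while leaving yaw free; the distinction versus the 6-DOF desired condition can be made explicit in code (the tolerances here are assumed values, not specified anywhere in this plan):

```python
import math

def docked_5dof(pose, tol_pos_m=0.05, tol_tilt_rad=0.1):
    # pose = (x, y, z, roll, pitch, yaw) relative to the dock centre.
    # 5-DOF docking: position and tilt constrained, yaw left free.
    x, y, z, roll, pitch, _yaw = pose
    return (math.hypot(x, y) <= tol_pos_m and abs(z) <= tol_pos_m
            and abs(roll) <= tol_tilt_rad and abs(pitch) <= tol_tilt_rad)

def docked_6dof(pose, tol_yaw_rad=0.1, **kwargs):
    # The desired SVE condition additionally constrains yaw.
    return docked_5dof(pose, **kwargs) and abs(pose[5]) <= tol_yaw_rad
```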
  9. Preliminary Full System Integration
  • Objective: The purpose of this test is to prepare a simplified version of our complete SVE. This will force us to complete the integration of our various subsystems in preparation for the final demonstration. This demonstration will not have the polish of our final SVE; however, it should contain each of the major required elements in more-or-less working order. Minor intervention or restarts may be required and will be allowed.
  • Elements: This test will feature the following elements:
    1. Vision subsystem – In order to recognize and lock onto the dock position, we must showcase that the vision system is able to recognize tags and get position estimates from them.
    2. Control subsystem – In order to send waypoint information for the predefined cone-search path, we must have accurate position control in order to have waypoint following.
    3. Navigation system – The entire navigation system will be tested for integration effectiveness.
    4. Autonomous position locking – The system must be able to lock around a position in order to hover over the dock in preparation for landing.
    5. Autonomous docking – The system will complete the full docking procedure.
  • Location: NSH B-Level Basement
  • Equipment:
    1. Iris+ Drone
    2. Wellhead infrastructure with identifying tag
    3. Dock infrastructure with identifying tag
    4. Warning tape
  • Personnel:
    1. Erik Sjoberg – Erik will be the operator who is required to run the commands for the drone.
  • Procedure:
    1. Cordon off section of hallway
    2. Place wellhead at one corner of search area and dock 1m in front of the wellhead
    3. Place Iris+ on ground at opposite corner of search area facing wellhead within +/- 5 degrees
    4. Hit START button on PC to initiate sequence
    5. Confirm Iris+ lifts off and begins searching for wellhead (marker)
    6. Confirm Iris+ arrives within 3 meter radius of wellhead
    7. Confirm Iris+ orients above dock in pre-docking position (within 1 meter of dock)
    8. Confirm Iris+ successfully lands in dock, constrained in 5 DOF
    9. Restart from step 3 if unsuccessful
  • Verification Criteria:
    1. Iris+ autonomously takes off from ground
    2. Iris+ arrives within 3 meter radius of wellhead
    3. Dock with docking station, constrained in 5 DOF
    4. Multiple attempts along with some manual intervention will be allowed
  10. SVE Preview Demonstration
  • Objective: The purpose of this test is to prepare a more-or-less complete version of our SVE. This will be the first demonstration which requires a fully working system without manual intervention. Multiple restarts may be required, but the system must work autonomously.
  • Elements: This test will feature the following elements:
    1. Vision subsystem – In order to recognize and lock onto the dock position, we must showcase that the vision system is able to recognize tags and get position estimates from them.
    2. Control subsystem – In order to send waypoint information for the predefined cone-search path, we must have accurate position control in order to have waypoint following.
    3. Navigation system – The entire navigation system will be tested for integration effectiveness.
    4. Autonomous position locking – The system must be able to lock around a position in order to hover over the dock in preparation for landing.
    5. Autonomous docking – The system will complete the full docking procedure.
  • Location: NSH B-Level Basement
  • Equipment:
    1. Iris+ Drone
    2. Wellhead infrastructure with identifying tag
    3. Dock infrastructure with identifying tag
    4. Warning tape
  • Personnel:
    1. Cole Gulino – will be the operator who runs the commands for the drone.
    2. Erik Sjoberg – will be a standby operator to manually control drone in an emergency situation
  • Procedure:
    1. Cordon off section of hallway
    2. Place wellhead at one corner of search area and dock 1m in front of the wellhead
    3. Place Iris+ on ground at opposite corner of search area facing wellhead within +/- 5 degrees
    4. Hit START button on PC to initiate sequence
    5. Confirm Iris+ lifts off and begins searching for wellhead (marker)
    6. Confirm Iris+ arrives within 3 meter radius of wellhead
    7. Confirm Iris+ orients above dock in pre-docking position (within 1 meter of dock)
    8. Confirm Iris+ successfully lands in dock, constrained in 5 DOF
    9. Restart from step 3 if unsuccessful
  • Verification Criteria:
    1. Iris+ autonomously takes off from ground
    2. Iris+ arrives within 3 meter radius of wellhead
    3. Dock with docking station, constrained in 5 DOF

Spring Validation Experiment

Needed Equipment: Iris+ with hardware, wellhead, dock, caution tape, blast shields

Operational Area: 25 m² in the B-Level basement

Test Process:

  1. Cordon off section of hallway and place blast shields to protect spectators
  2. Place wellhead at one corner of search area and dock 1m in front of the wellhead
  3. Place Iris+ on ground at opposite corner of search area facing wellhead within +/- 5 degrees
  4. Hit START button on PC to initiate sequence
  5. Confirm Iris+ lifts off and begins searching for wellhead (marker)
  6. Confirm Iris+ arrives within 3 meter radius of wellhead
  7. Confirm Iris+ orients above dock in pre-docking position (within 1 meter of dock)
  8. Confirm Iris+ successfully lands in dock, constrained in 5 DOF

Success Conditions:

Mandatory:

  1. Iris+ autonomously takes off from ground
  2. Iris+ arrives within 3 meter radius of wellhead
  3. Dock with docking station, constrained in 5 DOF

Desired:

  1. Dock constrained in 6 DOF
  2. Successfully avoid obstacles
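The mandatory and desired conditions above reduce to a small checklist; a sketch of how a run could be scored automatically (the function and field names are ours, not part of the test infrastructure):

```python
def evaluate_sve(takeoff_ok, dist_to_wellhead_m, dock_dof, avoided_obstacles=False):
    # Score one SVE run against the mandatory and desired success conditions.
    mandatory = takeoff_ok and dist_to_wellhead_m <= 3.0 and dock_dof >= 5
    return {
        "mandatory": mandatory,
        "desired_6dof_dock": mandatory and dock_dof >= 6,
        "desired_obstacle_avoidance": avoided_obstacles,
    }
```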


Parts List

Date Requested | Part No. | Part Name | Quantity | Total Price | Website Link
10/15/2015 | | 3DR IRIS+ Quadcopter | 1 | $599.99 | http://store.3drobotics.com/products/iris
10/15/2015 | | 3DR IRIS+ Propellers | 4 | $39.96 | http://store.3drobotics.com/products/iris-plus-propellers
10/10/2015 | 595-MINNOWMAX-DUAL | MINNOWBOARD-MAX-DUAL | 1 | $145.95 | http://www.mouser.com/ProductDetail/CircuitCo/MINNOWBOARD-MAX-DUAL/?qs=sGAEpiMZZMs9lZI8ah3py%2f9KKP2eiFfZsbAbf7dOFFRyaqaqEqmd8g%3d%3d
10/25/2015 | | Odroid XU-4 Board | 1 | $83.00 | http://ameridroid.com/products/odroid-xu4
11/5/2015 | | PX4Flow | 1 | $149.00 | http://store.3drobotics.com/products/px4flow
11/5/2015 | | Iris+ Battery | 2 | $80.00 | https://store.3drobotics.com/products/iris-plus-battery
11/18/2015 | | NicaDrone Permanent Magnet | 2 | $90.00 | http://nicadrone.com/index.php?id_product=59&controller=product
12/14/2015 | | 3DR IRIS+ #2 | 1 | $599.99 | http://store.3drobotics.com/products/iris
12/14/2015 | | PX4Flow #2 | 1 | $149.00 | http://store.3drobotics.com/products/px4flow
12/14/2015 | | Asus Xtion Pro Live | 1 | $329.99 | http://www.ebay.com/itm/NEW-ASUS-Xtion-PRO-LIVE-B-U-RGB-and-Depth-Sensor-/291637793920?hash=item43e6f79480:g:fJwAAOSwxN5WbQnp
10/25/2015 | | Odroid XU-4 Board | 1 | $83.00 | http://ameridroid.com/products/odroid-xu4
1/19/2016 | | Intel RealSense Camera | 1 | $99.00 | http://click.intel.com/intel-realsense-developer-kit-r200.html

Total estimated cost: $3170.52

Remaining budget: $829.48
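The two figures above imply a $4000 total budget; a small helper for keeping the parts list and the remaining-budget figure consistent (price strings as formatted in the table):

```python
BUDGET_USD = 4000.00  # implied: $3170.52 spent + $829.48 remaining

def parse_price(price_str):
    # "$599.99" -> 599.99
    return float(price_str.replace("$", "").replace(",", ""))

def remaining_budget(spent_usd, budget_usd=BUDGET_USD):
    return round(budget_usd - spent_usd, 2)
```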

Issues Log

Issues are maintained on the project's GitHub page: Column Repository Issues