PROJECT MANAGEMENT

Schedules

Spring semester schedule

Fall semester schedule

Fall Validation Demonstration

Location: AI Makerspace

Equipment: Hello Robot’s Stretch, gimbal mount, a display device, user device, Wi-Fi network, objects for grasping, dynamic obstacles in the environment

Objective: Demonstrate that the user can teleoperate the robot and have it autonomously navigate to a desired location, that the robot can detect pre-defined objects and autonomously grasp and place them, and that the gimbal can be manually controlled.

Autonomous Navigation & Teleoperation for Touring

  1. Procedure:
    1. Place the robot at the start location and observe the robot localize.
    2. Pick a goal for the robot on the interface and observe the robot plan the path.
    3. Observe the robot autonomously navigating to the goal while avoiding static and dynamic obstacles.
    4. Allow the user to control the pan-tilt display during the autonomous traversal of the robot to view the surroundings. 
    5. On reaching the location, teleoperate the robot to tour the AI Makerspace and observe the robot alert the user about obstacles during teleoperation.
    6. Control the pan-tilt display through the user interface to observe the surroundings and observe audio-visual communication between the remote user and the people in the environment.
  2. Validation:
    1. The user can send the tour location to the robot through the user interface.
    2. The robot is able to localize within the map and plan global and local paths (a goal-dispatch sketch follows this list).
    3. The robot is able to autonomously navigate to the goal with obstacle avoidance.
    4. The user is able to teleoperate the robot and control the pan-tilt display mounted on the robot for interaction.
    5. The robot is able to detect obstacles around it and alert the user.
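The localization and path-planning checks above boil down to dispatching the interface-selected goal to the navigation stack. The following is a minimal sketch assuming a standard ROS move_base setup; the node, topic, and frame names are illustrative, not necessarily the ones used on TouRI.

  #!/usr/bin/env python
  # Minimal goal-dispatch sketch (assumes a move_base-based navigation stack).
  import actionlib
  import rospy
  from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

  def send_tour_goal(x, y):
      client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
      client.wait_for_server()
      goal = MoveBaseGoal()
      goal.target_pose.header.frame_id = 'map'   # robot must be localized in this frame
      goal.target_pose.header.stamp = rospy.Time.now()
      goal.target_pose.pose.position.x = x
      goal.target_pose.pose.position.y = y
      goal.target_pose.pose.orientation.w = 1.0  # identity orientation for simplicity
      client.send_goal(goal)                     # global and local planners take over
      client.wait_for_result()
      return client.get_state()

  if __name__ == '__main__':
      rospy.init_node('tour_goal_client')
      send_tour_goal(3.0, 1.5)

Obstacle avoidance (validation item 3) happens in the local planner underneath this call, so the interface only needs to supply the goal pose.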

Autonomous Pick and Place of Souvenirs

  1. Procedure:
    1. After touring, select the souvenir shop to autonomously navigate the robot to it.
    2. Select the picking option on the interface to begin choosing a souvenir.
    3. Observe that the robot detects the souvenirs and returns the list to the interface (a sketch of this detect-select-pick flow follows the validation list).
    4. Select the souvenir to be picked through the interface.
    5. Observe the robot autonomously pick up the selected souvenir and drop it in a box at the counter to be shipped.
  2. Validation:
    1. The robot is able to autonomously navigate to the souvenir shop.
    2. The user is able to teleoperate the robot inside the souvenir shop.
    3. The robot is able to detect the souvenirs in its field of view.
    4. The user is able to select the souvenir to be picked up from the interface.
    5. The robot is able to autonomously pick up the selected souvenir and place it in a shipping container on the counter.
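The validation items above form a detect-select-pick round trip between the app and the robot. A minimal sketch of that flow is below; the service names (/detect_souvenirs, /pick_object) and the use of std_srvs/Trigger are hypothetical stand-ins for the team's actual perception and manipulation APIs, which would carry richer request and response types.

  #!/usr/bin/env python
  # Hypothetical detect-select-pick round trip (service names are assumptions).
  import rospy
  from std_srvs.srv import Trigger

  def run_souvenir_pick():
      rospy.wait_for_service('/detect_souvenirs')
      detect = rospy.ServiceProxy('/detect_souvenirs', Trigger)
      labels = detect().message.split(',')   # e.g. "mug,keychain,sticker"
      print('Detected souvenirs:', labels)

      choice = labels[0]                     # in the app, the user taps one label
      # A real API would pass `choice` in a custom service request;
      # Trigger is used here only to keep the sketch self-contained.
      rospy.wait_for_service('/pick_object')
      pick = rospy.ServiceProxy('/pick_object', Trigger)
      return pick().success

  if __name__ == '__main__':
      rospy.init_node('souvenir_pick_client')
      run_souvenir_pick()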


Test plan

Fall semester

  PR 8 (09/29/22)
    Milestone(s): Bring up the autonomous navigation stack and ensure it works as it did at SVD. Implement object detection for the shipping box. Implement the autonomous manipulation pipeline for placing an object in the box. Develop the user interface with audio/video communication.
    Test(s): Test 2, Test 7
    Requirements: MF8, MF9

  PR 9 (10/13/22)
    Milestone(s): Re-map the hallway (due to a change in the environment) and integrate it with the autonomous navigation stack. Fine-tune the user interface and implement asynchronous communication. Refine the grasping pipeline for multiple objects. Implement object detection for souvenir objects.
    Test(s): Test 3, Test 1, Test 8
    Requirements: MF1, MF2, MF8, MF9

  PR 10 (11/03/22)
    Milestone(s): Design the servo actuator bracket. Test E-kill with a new app. Update firmware for the CMU DEVICE and switch to fallback Wi-Fi access points in case the primary network is lost. Tune the autonomous navigation stack with the new map. Integrate all subsystems.
    Test(s): Test 10, Test 5
    Requirements: MF1, MF2

  PR 11 (11/17/22)
    Milestone(s): Assemble the pan/tilt display on TouRI and test teleoperation. Integrate obstacle detection using lidar during teleoperation. Test data flow throughout the system.
    Test(s): Test 9, Test 6, Test 4
    Requirements: MF1, MF2, MF10, MF11

  FVD (11/27/22)
    Milestone(s): Demonstrate full system functionality.
    Test(s): All
    Requirements: All

Spring semester

  1. Action: User starts the TouRI robot app and selects a location to tour.
     Demonstration: Demonstrate user interaction with the app and the capability to receive location input from the user within 5 seconds and generate the goal waypoint.
     Condition for operation: > 100 Mbps broadband connectivity.
     Requirements validated: MP3, MNF1
  2. Action: Based on the waypoint input, the navigation module generates a global path to the goal location.
     Demonstration: Demonstrate the capability of the navigation module to generate the global path within 3 minutes.
     Requirements validated: MP4
  3. Action: The robot detects static and dynamic obstacles and avoids them by following the obstacle avoidance trajectory generated by the local planner.
     Demonstration: Demonstrate obstacle detection and avoidance by the robot with an mAP of 60%.
     Condition for operation: for obstacles lying in the FOV of the sensing modalities.
     Requirements validated: MP5
  4. Action: Robot reaches the goal location.
     Demonstration: Demonstrate the ability of the robot to reach the desired location at a speed of 0.4 m/s during traversal.
     Condition for operation: limited obstacles are present in the surroundings, and a path exists for the robot to avoid obstacles and traverse ahead.
     Requirements validated: MP1
  5. Action: The robot detects the doorknob and estimates its grab point.
     Demonstration: Demonstrate doorknob detection and grab point estimation with an mAP of 40%.
     Condition for operation: for a predefined set of doorknobs in the environment and appropriate lighting.
     Requirements validated: MP6
  6. Action: Based on the grab point input, the manipulation module plans the trajectory to open the door.
     Demonstration: Demonstrate the motion planning capability of the manipulation module to generate the trajectory within 3 minutes.
     Requirements validated: MP7
  7. Action: The manipulator is simulated in ROS and executes the planned trajectory.
     Demonstration: Demonstrate manipulator motion to open doors (simulation in Gazebo).
  8. Action: The standalone gimbal mount rotates the display device to view the surroundings.
     Demonstration: Demonstrate gimbal motion of 60 degrees pitch and 120 degrees yaw for the display device (a clamping sketch follows this table).
     Requirements validated: MP9
  9. Action: The app provides the video call interface between the user and the surroundings.
     Demonstration: Demonstrate a 1080x720 resolution video call with a lag of less than 2 seconds.
     Requirements validated: MP10
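Row 8's gimbal figures imply a hard limit on user pan/tilt commands. The sketch below clamps commands to that travel and converts them to servo goal positions; it assumes the 60-degree pitch and 120-degree yaw travel are centered at zero, and that the mount uses Dynamixel-style servos with 4096 ticks per revolution, neither of which is confirmed by the table.

  # Clamp user gimbal commands to the spec'd travel (assumptions noted above).
  PITCH_RANGE_DEG = (-30.0, 30.0)   # 60 degrees total travel
  YAW_RANGE_DEG = (-60.0, 60.0)     # 120 degrees total travel
  TICKS_PER_REV = 4096              # typical Dynamixel resolution

  def clamp(value, lo, hi):
      return max(lo, min(hi, value))

  def gimbal_command(pitch_deg, yaw_deg):
      """Clamp a joystick command to the mount's travel and convert each
      axis to a servo goal position in ticks (centered at 2048)."""
      pitch = clamp(pitch_deg, *PITCH_RANGE_DEG)
      yaw = clamp(yaw_deg, *YAW_RANGE_DEG)
      to_ticks = lambda deg: int(round(2048 + deg / 360.0 * TICKS_PER_REV))
      return to_ticks(pitch), to_ticks(yaw)

  print(gimbal_command(45.0, -150.0))  # out-of-range input clamps to (2389, 1365)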

Fall semester

  1. Action: User starts the TouRI robot app.
     Demonstration: Demonstrate a user-friendly app interface.
     Requirements validated: MNF2
  2. Action: Based on the location input entered by the user in the app, the goal waypoint for the location is generated and sent to the robot.
     Demonstration: Demonstrate integration between the user interface and navigation modules.
  3. Action: During autonomous traversal, the robot detects static and dynamic obstacles and avoids them by following the obstacle avoidance trajectory generated by the local planner.
     Demonstration: Demonstrate obstacle detection and avoidance by the robot with an mAP of 80%.
     Condition for operation: for obstacles lying in the FOV of the sensing modalities during autonomous traversal.
     Requirements validated: MP5
  4. Action: The robot detects the doorknob on reaching the desired location and estimates its grab point.
     Demonstration: Demonstrate doorknob detection and grab point estimation with an mAP of 60%. Demonstrate integration between the navigation and perception modules.
     Condition for operation: for a predefined set of doorknobs in the environment and appropriate lighting.
     Requirements validated: MP6
  5. Action: The manipulator executes the planned trajectory to open the door.
     Demonstration: Demonstrate integration between the perception and manipulation modules. Demonstrate manipulator motion to open doors within 5 minutes with 75% accuracy.
     Requirements validated: MP8
  6. Action: Robot successfully enters the desired room location.
     Demonstration: Demonstrate integration between the user interface, navigation, perception, and manipulation modules for task execution in 30 minutes.
     Condition for operation: goal location in the operating area and 100 m away from the robot's start location.
     Requirements validated: MP2
  7. Action: On entering the desired room, the robot accepts inputs from the user for teleoperated traversal.
     Demonstration: Demonstrate integration between the user interface and navigation modules.
  8. Action: The robot retracts teleoperation control from the user when obstacles are detected and transitions to autonomous mode. The robot detects and avoids static and dynamic obstacles by following the obstacle avoidance trajectory generated by the local planner.
     Demonstration: Demonstrate situational awareness. Demonstrate obstacle detection and avoidance by the robot with an mAP of 80% (a supervisor sketch follows this table).
     Condition for operation: for obstacles lying in the FOV of the sensing modalities during teleoperated traversal.
     Requirements validated: MP5, MNF4
  9. Action: The gimbal mount integrated on the robot rotates the display device for the user to view the surroundings.
     Demonstration: Demonstrate integration of the gimbal platform with the robot. Demonstrate natural interaction.
     Requirements validated: MNF5
  10. Action: System-level requirements.
      Demonstration: Demonstrate sub-system modularity with APIs.
      Requirements validated: MNF3
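Row 8 describes the robot retracting teleoperation control when obstacles appear. A minimal supervisor sketch is below; the topic names and the 0.5 m safety radius are assumptions, and the real system presumably hands control to the local planner rather than simply dropping user commands.

  #!/usr/bin/env python
  # Teleop-override sketch: forward user velocity commands until the lidar
  # reports an obstacle inside a safety radius, then withhold them.
  import rospy
  from geometry_msgs.msg import Twist
  from sensor_msgs.msg import LaserScan

  SAFETY_RADIUS_M = 0.5   # assumed threshold for retracting teleop control

  class TeleopSupervisor(object):
      def __init__(self):
          self.autonomous = False
          self.cmd_pub = rospy.Publisher('/stretch/cmd_vel', Twist, queue_size=1)
          rospy.Subscriber('/scan', LaserScan, self.on_scan)
          rospy.Subscriber('/teleop/cmd_vel', Twist, self.on_teleop)

      def on_scan(self, scan):
          ranges = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
          # Retract teleop control whenever anything enters the safety radius.
          self.autonomous = bool(ranges) and min(ranges) < SAFETY_RADIUS_M

      def on_teleop(self, cmd):
          if not self.autonomous:    # user commands pass through in teleop mode
              self.cmd_pub.publish(cmd)
          # else: drop user input; the autonomous planner owns the command topic.

  if __name__ == '__main__':
      rospy.init_node('teleop_supervisor')
      TeleopSupervisor()
      rospy.spin()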

Parts list

We have created a Google Sheet to keep track of our parts list. Click here to view the sheet.


Issues log

  Issue 001
    Found: 3rd Jan, 2022    Fixed: 14th Jan, 2022
    Change made by: Prakhar Pradeep (ppradee2)
    Origin: Manipulation test
    Description: Manipulator is not capable of performing tasks without moving the base.
    Resolution: Changed the manipulation subsystem to integrate navigation for small distances.
    Artifact(s) changed: Software architecture (ROS package)

  Issue 002
    Found: 28th Jan, 2022    Fixed: 15th Feb, 2022
    Change made by: Prakhar Pradeep (ppradee2)
    Origin: Tele-op performance test
    Description: Interface would not reset the joystick X and Y to (0, 0) on releasing the joystick (bug in a third-party package).
    Resolution: Implemented a custom gesture detector to listen for joystick release.
    Artifact(s) changed: User interface source code

  Issue 003
    Found: 20th Feb, 2022    Fixed: 27th Feb, 2022
    Change made by: Shruti Gangopadhyay (sgangopa)
    Origin: Robot localisation test
    Description: Issues with the set-up of the robot's base packages.
    Resolution: Corrected the set-up issues (transforms, package integration, etc.).
    Artifact(s) changed: Base set-up packages

  Issue 004
    Found: 5th Mar, 2022    Fixed: 23rd Mar, 2022
    Change made by: Shruti Gangopadhyay (sgangopa)
    Origin: Robot motion test
    Description: Robot translates well on carpet but accumulates offset during rotation on carpet.
    Resolution: Integrated an IMU sensor for odometry.
    Artifact(s) changed: Navigation stack

  Issue 005
    Found: 27th Mar, 2022    Fixed: not yet resolved
    Change made by: Shruti Gangopadhyay (sgangopa)
    Origin: Full autonomous navigation stack test
    Description: Drift in the IMU causes the robot to lose localisation when stationary, which also affects path planning to the goal location.
    Resolution: Not yet resolved.

  Issue 006
    Found: 8th Feb, 2022    Fixed: 14th Feb, 2022
    Change made by: Shivani Sivakumar (ssivaku3)
    Origin: MoveIt package integration test
    Description: Not able to integrate the MoveIt package for the manipulator, as it is not supported on ROS, only on ROS 2.
    Resolution: Worked with the robot's built-in helper packages for manipulation planning.

  Issue 007
    Found: 15th Mar, 2022    Fixed: not yet resolved
    Change made by: Shivani Sivakumar (ssivaku3)
    Origin: Manipulation test
    Description: Launch file keeps crashing when trying to test manipulation.
    Resolution: Not yet resolved.
    Artifact(s) changed: Manipulation stack

  Issue 008
    Found: 23rd Mar, 2022    Fixed: 6th Apr, 2022
    Change made by: Shivani Sivakumar (ssivaku3)
    Origin: Manipulation test
    Description: Wrist is sometimes not able to extend when testing the full manipulation stack involving base navigation and manipulator planning.
    Resolution: Wrote our own scripts for actuation without depending much on the helper packages.
    Artifact(s) changed: Manipulation stack

  Issue 009
    Found: 31st Apr, 2022    Fixed: 7th May, 2022
    Change made by: Jashkumar Diyora (jdiyora)
    Origin: Robot hardware
    Description: The robot gripper and wrist could not ping and respond to software commands.
    Resolution: Changed baud rates (a baud-rate sketch follows this log).
    Artifact(s) changed: Baud rate setting (57600 -> 115200)

  Issue 010
    Found: 16th May, 2022    Fixed: 20th May, 2022
    Change made by: Jashkumar Diyora (jdiyora)
    Origin: Robot hardware
    Description: Battery over-drain caused the Intel NUC to go into a deep sleep.
    Resolution: Hello Robot provided hardware battery protection before the power distribution inside the system; battery over-drain protection was added.

  Issue 011
    Found: 3rd Oct, 2022    Fixed: 17th Oct, 2022
    Change made by: Jashkumar Diyora (jdiyora)
    Origin: Robot firmware
    Description: The robot had software issues due to package dependencies. A decision was made to wipe the robot and reinstall the OS. As the company had updated the OS, many software packages and naming conventions were also updated, causing multiple issues with the previously developed Dynamixel software.
    Resolution: All the dependencies tied to the core of RE1 were reprogrammed for REx (the same software works for RE1 and RE2).
    Artifact(s) changed: RE1 -> REx

  Issue 012
    Found: 19th Apr, 2022    Fixed: 24th Apr, 2022
    Change made by: Shivani Sivakumar (ssivaku3)
    Origin: Manipulation test
    Description: FUNMAP kept crashing the robot.
    Resolution: Wrote our own planner for the manipulator and base.
    Artifact(s) changed: Manipulation stack

  Issue 013
    Found: 23rd Apr, 2022    Fixed: 30th Apr, 2022
    Change made by: Shivani Sivakumar (ssivaku3), Jigar Patel (jkpatel)
    Origin: Perception test
    Description: Error when trying to reach the centroid of an object.
    Resolution: Incorporated visual feedback.
    Artifact(s) changed: Perception stack

  Issue 014
    Found: 28th Oct, 2022    Fixed: 5th Nov, 2022
    Change made by: Shivani Sivakumar (ssivaku3), Jigar Patel (jkpatel)
    Origin: Perception test
    Description: The 3D pipeline is very slow, so the entire dropping/picking pipeline takes time.
    Resolution: Rewrote the 3D pipeline as a service in C++ instead of Python.
    Artifact(s) changed: Perception/manipulation stack

  Issue 015
    Found: 28th Oct, 2022    Fixed: 6th Nov, 2022
    Change made by: Shivani Sivakumar (ssivaku3), Jigar Patel (jkpatel)
    Origin: Perception test
    Description: The 2D pipeline is slow at inference.
    Resolution: Set up a server/client architecture for detections on a local machine GPU.
    Artifact(s) changed: Perception/manipulation stack

  Issue 016
    Found: 8th Oct, 2022    Fixed: 13th Oct, 2022
    Change made by: Shruti Gangopadhyay (sgangopa)
    Origin: Autonomous navigation test
    Description: Robot loses localisation while traversing autonomously; the issue is due to featureless hallways.
    Resolution: Tuned the parameters for mapping and localisation, increasing the trust in odometry.
    Artifact(s) changed: Navigation stack

  Issue 017
    Found: 25th Oct, 2022    Fixed: 27th Oct, 2022
    Change made by: Shruti Gangopadhyay (sgangopa)
    Origin: Autonomous navigation test
    Description: Robot finds it hard to localise at the goal.
    Resolution: Added features near the goals to uniquely identify them, which helps localisation near the goals; also increased the goal tolerance slightly.
    Artifact(s) changed: Navigation stack
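Issue 009 was closed by raising the serial baud rate on the wrist/gripper bus. A sketch of that check using the standard dynamixel_sdk Python API is below; the device path and servo ID are assumptions for illustration, not values taken from the robot.

  # Reopen the Dynamixel bus at the new baud rate and ping a servo (issue 009).
  from dynamixel_sdk import PortHandler, PacketHandler

  PORT = '/dev/hello-dynamixel-wrist'   # assumed device path on the Stretch
  BAUD = 115200                         # raised from 57600 per issue 009

  port = PortHandler(PORT)
  packet = PacketHandler(2.0)           # Dynamixel Protocol 2.0

  if port.openPort() and port.setBaudRate(BAUD):
      model, result, error = packet.ping(port, 13)  # servo ID 13 is illustrative
      print('ping result:', result, 'model:', model)
  port.closePort()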

RISK MANAGEMENT