System Implementation

Robotic Arm

The robotic arm system will be mounted on top of the mobile platform. It needs to reach the entire width of the pan—about 2 meters—while maintaining manipulability. After evaluating several options, we settled on a subsystem consisting of a 1-meter-long linear actuator and a UFactory 850 arm, with the arm mounted on top of the actuator to extend its reach.

We will be using the MoveIt package inside our ROS2 environment to plan trajectories for both general maneuvering and cleaning. MoveIt allows us to import custom objects into the scene, so the planner is aware of the arm’s surroundings. MoveIt is also flexible enough for us to build a custom plugin for the arm’s inverse kinematics. Because the arm has only 6 degrees of freedom, there is a finite set of inverse kinematics solutions for nearly all reachable poses. Based on the drip pan environment, we can select the IK solution that performs best, making our trajectories more repeatable, reliable, and quicker to solve.
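Since a 6-DOF arm admits only a finite set of IK solutions per pose, the selection step can be sketched as a simple scoring pass. The function and weighting below are illustrative only, not our actual MoveIt plugin, which is written against MoveIt's C++ kinematics interface:

```python
import numpy as np

def select_ik_solution(candidates, current_joints, weights=(1.0, 0.1)):
    """Pick the best IK solution from a finite candidate set.

    Scores each candidate by (a) distance from the current joint
    configuration, favouring short and repeatable motions, and
    (b) distance from mid-range, favouring a manipulability margin.
    `candidates` is a list of joint-angle vectors in radians.
    """
    w_travel, w_center = weights
    best, best_score = None, float("inf")
    for q in candidates:
        q = np.asarray(q, dtype=float)
        joint_travel = np.linalg.norm(q - current_joints)
        off_center = np.linalg.norm(q)  # assumes joint limits symmetric about zero
        score = w_travel * joint_travel + w_center * off_center
        if score < best_score:
            best, best_score = q, score
    return best
```

The same scoring idea carries over regardless of which criteria are weighted; the point is that with a finite solution set, selection is a cheap loop rather than an optimization problem.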

Arm motion simulated in RViz

To further improve the realism of the simulation, the cleaning end-effector was also added to the simulated environment. Coupled with our trajectory generation scripts, we can see the arm sliding the cleaning tool over the surface of the pan and towards the drain, as intended.
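A cleaning pass of this kind can be sketched as interpolating Cartesian waypoints across the pan surface toward the drain at a constant tool height and angle. The function below is an illustrative sketch, not our actual scripts, which feed the waypoints through MoveIt:

```python
import numpy as np

def sweep_waypoints(start_xy, drain_xy, surface_z, tool_pitch, n=20):
    """Generate Cartesian waypoints that slide the squeegee across the
    pan surface toward the drain, holding height and tool angle fixed.

    Returns a list of (x, y, z, pitch) tuples in the pan frame.
    """
    xs = np.linspace(start_xy[0], drain_xy[0], n)
    ys = np.linspace(start_xy[1], drain_xy[1], n)
    return [(x, y, surface_z, tool_pitch) for x, y in zip(xs, ys)]
```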

We also integrated visual information into the arm’s movement: by attaching a camera to the rig, we can identify creosote positions on the pan. These positions are then sent to the arm for cleaning.
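Handing a detection to the arm can be sketched as mapping the centroid of a binary creosote mask into pan-frame coordinates. The function name, the fixed metres-per-pixel scale, and the flat-pan overhead-camera assumption are all ours for illustration:

```python
import numpy as np

def creosote_target(mask, scale_m_per_px, origin_xy):
    """Map a binary creosote mask to a pan-frame cleaning target.

    Computes the centroid of the detected pixels and converts it to
    metres, assuming an overhead camera with a known, uniform
    metres-per-pixel scale. Returns None when nothing is detected.
    """
    vs, us = np.nonzero(mask)
    if us.size == 0:
        return None
    u, v = us.mean(), vs.mean()
    return (origin_xy[0] + u * scale_m_per_px,
            origin_xy[1] + v * scale_m_per_px)
```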

The arm in simulation identifying creosote on the pan

We also worked on the linear actuator that moves the arm on top of the mobility platform. The linear actuator gives the arm a much longer reach, allowing it to cover a larger area of the pan. The actuator was also fitted with limit switches, which stop the carriage before it can run into the actuator’s ends and damage it.
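The limit-switch guard can be sketched as a software interlock around the stepping loop. The class and callback names below are illustrative, with hardware access abstracted behind callables:

```python
class GuardedActuator:
    """Linear-actuator motion guarded by limit switches at both ends.

    `read_min` and `read_max` are callables returning True when the
    corresponding switch is pressed; `step_fn` commands one step in
    the given direction (+1 toward the max end, -1 toward the min end).
    """

    def __init__(self, read_min, read_max, step_fn):
        self.read_min = read_min
        self.read_max = read_max
        self.step_fn = step_fn

    def move(self, direction, steps):
        """Step in `direction`, stopping as soon as the limit switch on
        that side triggers. Returns the number of steps actually taken."""
        taken = 0
        for _ in range(steps):
            hit = self.read_max() if direction > 0 else self.read_min()
            if hit:
                break
            self.step_fn(direction)
            taken += 1
        return taken
```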

Linear actuator section, with limit switch at the edge

Mobile Platform

Our mobility platform is designed to traverse the drip pan over the rails. Our arm is mounted on a linear actuator that runs laterally on the robot. After researching various materials for the chassis, we landed on 80/20 aluminum extrusions because they are relatively inexpensive, easy to assemble, and allow for simple mounting of other hardware and physical parts. The mobility platform measures roughly 1.2 by 0.5 meters and is driven by a single stepper motor with a worm drive gearbox.
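Commanding the platform means converting metres of travel into stepper pulses through the worm gearbox. The arithmetic can be sketched as below; all numeric values in the usage example are illustrative, not measured from our hardware:

```python
def steps_per_metre(steps_per_rev, microstep, gear_ratio, wheel_circumference_m):
    """Convert platform travel into stepper pulses.

    One output revolution of the worm gearbox needs
    steps_per_rev * microstep * gear_ratio input pulses and moves the
    platform one drive-wheel circumference.
    """
    return steps_per_rev * microstep * gear_ratio / wheel_circumference_m
```

For example, a 200-step motor at 16x microstepping through a 30:1 worm drive on a 0.2 m circumference wheel would need 480,000 pulses per metre of travel.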

Over the course of the Spring semester, we assembled the core of the mobile platform, which involved buying, machining, and integrating several parts to obtain a movable robot. The platform can now move on top of the rails and support the linear actuator that will eventually carry the arm. We also attached a distance sensor to its end to detect the end of the rails, which prompts the robot to stop.

The core of our mobile platform on our mockup of the drip pan

We also included cliff sensors on the platform to ensure it does not fall off the rails. When the sensor, which is offset from the platform, detects a void, it tells the system to brake.
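Together, the end-of-rail distance sensor and the cliff sensor reduce to a simple braking predicate. The stop margin below is an illustrative value, not a tuned parameter:

```python
def should_brake(distance_to_end_m, cliff_detected, stop_margin_m=0.05):
    """Brake when either the rail end is within the stop margin or the
    offset cliff sensor reads a void."""
    return cliff_detected or distance_to_end_m <= stop_margin_m
```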

Cliff detector sensor

Cleaner End Effector

We started developing this subsystem by evaluating the options Koppers had already been using to clean creosote. These included metal scrapers and chemically-resistant squeegees.

We soon noticed these methods were suboptimal. They failed to displace older, crystallized creosote and had to be paired with chemicals in order to achieve their objectives. As such, we set out to test new alternatives to these tools.

After research, we enumerated some possible candidates for this role, including a heated scraper used in beekeeping and a squeegee with a Viton rubber edge. To test these candidates, we brought them to the Koppers R&D center in Harmarville, PA, where we ran a number of experiments subjecting the tools to different conditions of heat, creosote buildup, and cleanliness.

Test run of Viton squeegee

The best tool by far was the Viton squeegee. But since we were unable to replicate the worst creosote conditions we saw at the factory, we have sent the squeegee to NLR for further assessment.

The results from the factory indicate that Viton is a very good choice for a cleaner, as it displaces creosote efficiently. It should be noted, though, that it was still unable to remove crystallized creosote without the help of chemicals; our sponsor confirmed this would not be a problem.

Based on that feedback, we made a CAD model that integrates the robot arm, the camera, and the Viton edge. The geometry of the end-effector was designed to accommodate the complex geometry of the drip pan.

CAD model of the end-effector. Includes Viton slot, camera mount and load cell for force feedback

Perception/Computing

For the perception subsystem, we started by analysing the kinds of images and inputs we are dealing with. From our factory visit last semester, we had captured a few images portraying the kinds of creosote-covered surfaces we expect to run identification algorithms on. The textures we found varied, caused by differing amounts of dust and wood fibers. Even the lighting conditions are expected to vary, as it is an industrial setting with some exposure to the elements. This lack of consistency across image features, aggravated by the lack of sufficient training data, rules out learning-based methods for semantic segmentation of the input scenes.

To define the flow of our system and how image capture integrates into the overall procedure, we identified the segments in which the pan will be captured by the camera installed on the robot. Using these criteria, we selected, from the images captured during our factory visit, those suitable for development purposes. Our initial attempt involved running Otsu’s thresholding on them; the results were encouraging even with this relatively simple method, with most of the creosote being identified. The results improved further with minimal preprocessing: eroding the input image to smooth the erratic glare spots caused by the reflective properties of creosote, as well as the extremely granular texture produced at various points by particle inclusions.
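Otsu's method picks the grayscale threshold that maximizes the between-class variance of the two resulting pixel populations. A numpy sketch of the computation (equivalent in effect to OpenCV's cv2.threshold with the cv2.THRESH_OTSU flag):

```python
import numpy as np

def otsu_threshold(gray):
    """Compute Otsu's threshold on an 8-bit grayscale image by
    maximizing between-class variance over the 256-bin histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    bins = np.arange(256)
    w0 = np.cumsum(hist)           # pixel count at or below each level
    w1 = total - w0
    mass0 = np.cumsum(hist * bins)  # intensity mass at or below each level
    mu0 = np.divide(mass0, w0, out=np.zeros(256), where=w0 > 0)
    mu1 = np.divide(mass0[-1] - mass0, w1, out=np.zeros(256), where=w1 > 0)
    between_var = w0 * w1 * (mu0 - mu1) ** 2
    return int(np.argmax(between_var))
```

A pixel is then classified as creosote when its intensity falls on the dark side of the returned threshold.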

The next challenge involved identification and segmentation of three classes: the thick black creosote layer, a fine golden film of creosote, and the cleaned surface, which is silver in colour. The two methods we have tried thus far are: 1) using different colour spaces, such as HSV and LAB, and 2) colour-based thresholding to separate the three colours. In the first method, we convert the input RGB image into other colour spaces and test which works best for the segmentation task; we found the LAB format, with some erosion, to work best. In the second method, we first convert the input image to grayscale and then segment out black, grey, and white as separate masks. Both methods worked on our preliminary data.
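After conversion to grayscale, the second method reduces to two fixed cut points. A sketch with illustrative threshold values, not ones tuned on factory data:

```python
import numpy as np

def three_class_masks(gray, t_black=80, t_white=180):
    """Split a grayscale image into three masks: black (thick creosote),
    grey (the golden film, after grayscale conversion), and white
    (cleaned silver surface)."""
    black = gray < t_black
    white = gray > t_white
    grey = ~black & ~white
    return black, grey, white
```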

Our detection of the creosote was improved even further by experimenting with adaptive thresholding segmentation, which made our vision stack more robust to changes in lighting and creosote texture. With this method, we have much higher confidence that the results obtained on our testbed will generalize to the NLR factory.
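Mean-based adaptive thresholding compares each pixel against the average of its local neighbourhood rather than a single global value, which is what makes the result tolerant of uneven lighting across the pan. A numpy sketch (OpenCV's cv2.adaptiveThreshold with ADAPTIVE_THRESH_MEAN_C computes a comparable per-pixel threshold); the block size and offset are illustrative:

```python
import numpy as np

def adaptive_threshold(gray, block=15, offset=5):
    """Threshold each pixel against the mean of its block x block
    neighbourhood minus an offset. Uses an integral image so every
    window average costs O(1)."""
    pad = block // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    # integral image, with a zero row/column prepended for clean slicing
    ii = np.pad(padded.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    h, w = gray.shape
    sums = (ii[block:block + h, block:block + w]
            - ii[:h, block:block + w]
            - ii[block:block + h, :w]
            + ii[:h, :w])
    local_mean = sums / (block * block)
    return gray > local_mean - offset
```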

Adaptive thresholding results