Description

1.1 Project Details

Oculus Research, Pittsburgh, has constructed a multi-sensor capture system consisting of a large number of cameras and microphones to perform motion tracking and 3D reconstruction with unprecedented accuracy. A critical component of achieving highly accurate 3D reconstruction and motion tracking is accurate sensor calibration, which is the focus of our project. The calibration process employs three methods:

  • Sensor Noise Calibration
  • Color Calibration
  • Geometric Calibration

Our team, in collaboration with Oculus VR, envisioned the system shown in Figure 1. We use an ABB robotic arm to maneuver a specially designed 3D calibration target through the capture space, capturing images that cover most of each camera's field of view. Our path-planning algorithm guarantees at least 85% field-of-view coverage for every camera while selecting the minimum number of target poses the ABB arm must visit.

After collecting images of the calibration target at the selected poses, we perform sensor noise calibration and color calibration. Sensor noise calibration removes fixed-pattern noise from the images via dark frame subtraction and gain normalization. Color calibration uses a standard ColorChecker chart to adjust the RGB channel gains so that the images reproduce true colors. Both steps are crucial for accurate 3D reconstruction. Finally, we run the geometric calibration algorithm on the noise- and color-corrected images. The average calibration accuracy (reprojection error) achieved by our team was below 0.2 pixels.

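The sensor noise step, dark frame subtraction followed by gain (flat-field) normalization, can be sketched as below. The helper name and the use of averaged dark and flat frames are assumptions about the setup, not the team's exact pipeline:

```python
import numpy as np

def remove_fixed_pattern_noise(raw, dark, flat):
    """Dark-frame subtraction and gain (flat-field) normalization.

    raw:  captured image, float array
    dark: average of frames taken with the lens capped
          (captures fixed-pattern / bias noise)
    flat: average of frames of a uniformly lit scene
          (captures per-pixel gain variation)
    """
    flat_corr = flat - dark
    # Per-pixel gain map, normalized so the mean gain is 1.
    gain = flat_corr / np.mean(flat_corr)
    # Subtract the fixed pattern, then divide out per-pixel gain.
    return (raw - dark) / np.maximum(gain, 1e-6)
```

Averaging many dark and flat frames suppresses temporal noise so that only the fixed pattern remains in the correction maps.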

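For the color calibration step, one common way to derive per-channel RGB gains is to balance the channels against a neutral gray patch of the ColorChecker chart. This sketch assumes that white-balancing approach with green held fixed; the team's actual fitting procedure may differ:

```python
import numpy as np

def channel_gains_from_gray(patch_rgb):
    """Per-channel gains that map a measured neutral ColorChecker patch
    to equal R, G, B values, keeping the green channel fixed.

    patch_rgb: mean (R, G, B) measured over the gray patch
    """
    r, g, b = patch_rgb
    return np.array([g / r, 1.0, g / b])

def apply_gains(image, gains):
    # Broadcasts over the trailing channel axis of an (..., 3) image.
    return image * gains
```

After applying the gains, a neutral patch measures equal values in all three channels, which is the condition the report's "true color" adjustment aims for.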
1.2 Project Goals

To design a turnkey solution for robotic multi-sensor calibration of a capture space consisting of a large number of cameras, achieving a translational accuracy better than 10 microns and a rotational accuracy better than one arc second.