Efficient model-based tracking for robot vision
This paper proposes a real-time, robust and efficient three-dimensional (3D) model-based tracking algorithm. A virtual visual servoing approach is used for monocular 3D tracking. This method is similar to more classical non-linear pose computation techniques. A concise method for derivation
of efficient distance-to-contour interaction matrices is described. An oriented edge detector is used to provide real-time tracking of points along the normals to the object contours. Robustness is obtained by integrating an M-estimator into the virtual visual control law via an iteratively
re-weighted least-squares implementation. The method presented in this paper has been validated on several visual servoing experiments considering various objects. Results show the method to be robust to occlusion, changes in illumination and mis-tracking.
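As an illustration of the robust-estimation step named in the abstract, the following is a minimal, hypothetical sketch of iteratively re-weighted least squares (IRLS) with a Huber M-estimator, applied here to a simple line-fitting problem rather than the paper's pose-estimation problem; all function names and parameter values (e.g. the Huber threshold `k=1.345`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def huber_weights(r, k=1.345):
    # Huber M-estimator weight: 1 inside the band |r| <= k, k/|r| outside,
    # so large residuals (outliers) are progressively down-weighted.
    a = np.abs(r)
    w = np.ones_like(a)
    mask = a > k
    w[mask] = k / a[mask]
    return w

def irls(A, b, iters=20):
    # Initialize with an ordinary least-squares solve.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    for _ in range(iters):
        r = A @ x - b
        # Scale residuals by a robust spread estimate (MAD) before weighting.
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        w = huber_weights(r / s)
        sw = np.sqrt(w)
        # Weighted least-squares solve: minimize sum w_i * r_i^2.
        x, *_ = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)
    return x

# Line fit y = 2x + 1 with one gross outlier: IRLS should recover the
# slope more accurately than plain least squares.
xs = np.linspace(0.0, 1.0, 20)
ys = 2.0 * xs + 1.0
ys[5] = 100.0  # simulated mis-tracked measurement
A = np.c_[xs, np.ones_like(xs)]
ols, *_ = np.linalg.lstsq(A, ys, rcond=None)
rob = irls(A, ys)
```

In the paper's setting the same weighting idea enters the control law: the point-to-contour residuals are re-weighted at each iteration so that occluded or mis-tracked edge points contribute little to the pose update.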