An intelligent real-time tracking and grasping system for a robotic work cell
This paper presents a new adaptive robot control system for a robotic work cell that visually tracks stationary and moving objects undergoing arbitrary motion and intercepts them at any point along their predicted trajectories within the robot's workspace. The system integrates a stationary monocular CCD camera, fitted with an off-the-shelf frame grabber, and an industrial robot into a single application on the MATLAB platform. A combination of model-based object recognition and a learning vector quantization (LVQ) network classifies non-overlapping stationary objects. An optical flow technique determines the target trajectory, and a MADALINE network generates the predicted robot trajectory under visual servoing. The approach eliminates the need to model the robot, the camera, the stationary and moving objects, or the environment: object locations and image features need not be preprogrammed, marked, or known in advance, and a task can be changed without modifying the robot program. After the learning process, the KUKA robot is shown to track and intercept both stationary and moving objects accurately in real time at an optimal rendezvous point on the conveyor.
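The prediction step described in the abstract (estimating the target's motion from successive camera frames, then extrapolating to a rendezvous point) can be illustrated with a minimal sketch. This is not the paper's MADALINE implementation: it is a hypothetical constant-velocity stand-in in Python, with all function names, frame rates, and coordinates invented for illustration.

```python
# Hypothetical sketch of trajectory prediction for a conveyor-borne object.
# The real system uses optical flow plus a MADALINE network; here we simply
# estimate velocity from tracked centroids and extrapolate linearly.

def estimate_velocity(positions, dt):
    """Average frame-to-frame velocity from a list of (x, y) centroids
    sampled every `dt` seconds."""
    n = len(positions)
    vx = sum(positions[i + 1][0] - positions[i][0] for i in range(n - 1)) / ((n - 1) * dt)
    vy = sum(positions[i + 1][1] - positions[i][1] for i in range(n - 1)) / ((n - 1) * dt)
    return vx, vy

def predict_rendezvous(positions, dt, horizon):
    """Extrapolate the last observed centroid `horizon` seconds ahead,
    giving a candidate interception point for the robot."""
    vx, vy = estimate_velocity(positions, dt)
    x, y = positions[-1]
    return x + vx * horizon, y + vy * horizon

# Example: object moving at 100 mm/s along x, sampled at 25 Hz (40 ms frames).
track = [(i * 4.0, 50.0) for i in range(10)]
print(predict_rendezvous(track, 0.04, 0.5))
```

In the actual system a learned predictor replaces the constant-velocity assumption, which lets the robot handle targets whose motion is not uniform.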