
Automatic Estimation of the Position and Orientation of the Drill to Be Grasped and Manipulated by the Disaster Response Robot Based on Analyzing Depth Camera Information

Towards the realization of a disaster response robot that can locate and manipulate a drill at an arbitrary position and posture in disaster sites, this paper proposes a method for estimating the position and orientation of the drill to be grasped and manipulated by the robot arm, based on information acquired by a depth camera. In the proposed algorithm, the target drill is first detected with a conventional method from an RGB image captured by the depth camera, and 3D point cloud data representing the target is generated by combining the detection result with the depth image. Second, the generated point cloud is processed by our proposed method to estimate the proper grasping position and orientation. More specifically, a pass-through filter is applied to the 3D point cloud obtained in the first step. The point cloud is then divided, and its features are classified so that the chuck and the handle of the drill are identified. The grasping position is obtained by computing the centroid of the chuck's point cloud, and the grasping orientation is obtained by applying Principal Component Analysis (PCA). Experiments were conducted on a simulator, and the results show that our method can accurately estimate the proper configuration for autonomously grasping a normal-type drill.
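As an illustration of the geometric steps described in the abstract, the minimal Python/NumPy sketch below shows a pass-through filter, a centroid-based grasp position, and a PCA-based grasp orientation. It is not the authors' implementation: the chuck/handle segmentation step is elided, and all function names, parameters, and the synthetic input cloud are assumptions for illustration only.

```python
import numpy as np

def pass_through_filter(points, axis, lower, upper):
    """Keep only points whose coordinate along `axis` lies in [lower, upper].

    points : (N, 3) array of XYZ coordinates in the camera frame.
    axis   : 0, 1, or 2 for x, y, or z.
    """
    mask = (points[:, axis] >= lower) & (points[:, axis] <= upper)
    return points[mask]

def grasp_position(chuck_points):
    """Grasp position: centroid of the chuck's point cloud."""
    return chuck_points.mean(axis=0)

def grasp_orientation(chuck_points):
    """Grasp orientation via PCA of the chuck's point cloud.

    Eigendecomposition of the 3x3 covariance matrix; the eigenvector
    with the largest eigenvalue approximates the drill's long axis.
    """
    centered = chuck_points - chuck_points.mean(axis=0)
    cov = np.cov(centered.T)                # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = eigvals.argsort()[::-1]         # re-sort descending
    return eigvecs[:, order]                # columns: 1st, 2nd, 3rd PC

if __name__ == "__main__":
    # Synthetic stand-in for a drill point cloud (illustration only):
    # elongated along x, thin along y and z.
    rng = np.random.default_rng(0)
    cloud = rng.normal(scale=[0.10, 0.01, 0.01], size=(2000, 3))

    # 1. Pass-through filter: discard background beyond 0.5 m along z.
    cloud = pass_through_filter(cloud, axis=2, lower=-0.5, upper=0.5)

    # 2. The paper classifies features to separate chuck from handle;
    #    here the whole synthetic cloud stands in for the chuck.
    chuck = cloud

    # 3. Grasp configuration.
    print("grasp position:", grasp_position(chuck))
    print("chuck axis (1st principal component):", grasp_orientation(chuck)[:, 0])
```

The centroid gives a translation target for the gripper, while the principal axes give a rotation target; how the remaining degree of freedom around the chuck axis is resolved is a detail of the paper's method not reproduced here.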

Keywords: depth camera; disaster response robot; point cloud data

Document Type: Research Article

Publication date: January 13, 2019

This article was made available online on January 13, 2019 as a Fast Track article with title: "Automatic estimation of the position and orientation of the drill to be grasped and manipulated by the disaster response robot based on analyzing depth camera information".
