
SARAFun: Smart Assembly Robot with Advanced FUNctionalities (H2020)

While industrial robots are very successful in many areas of industrial manufacturing, assembly automation still suffers from complex, time-consuming programming and the need for dedicated hardware. ABB has developed YuMi, a collaborative, inherently safe assembly robot that is expected to reduce integration costs significantly by offering a standardized hardware setup and simple fitting of the robot into existing workplaces. Internal pilot testing at ABB has, however, shown that when YuMi is programmed with traditional methods, the programming time remains very long even for simple assembly tasks. The SARAFun project has been formed to enable a non-expert user to integrate a new bi-manual assembly task on a YuMi robot in less than a day. This will be accomplished by augmenting the YuMi robot with cutting-edge sensory and cognitive abilities, as well as the reasoning abilities required to plan and execute an assembly task. The overall conceptual approach is that the robot should be capable of learning and executing assembly tasks in a human-like manner. Studies will be conducted to understand how human assembly workers learn and perform assembly tasks. The human performance will be modelled and transferred to the YuMi robot as assembly skills. The robot will learn assembly tasks, such as insertion or folding, by observing the task being performed by a human instructor. The robot will then analyze the task and generate an assembly program, including exception handling, and design 3D-printable fingers tailored to gripping the parts at hand. Aided by the human instructor, the robot will finally learn to perform the actual assembly task, relying on sensory feedback from vision, force, and tactile sensing, as well as on physical human-robot interaction. During this phase the robot will gradually improve its understanding of the assembly at hand until it is capable of performing the assembly in a fast and robust manner.
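
To make the idea of a generated "assembly program, including exception handling" driven by force feedback more concrete, below is a minimal Python sketch of a force-guided insertion skill with a recovery routine. Everything here is hypothetical: the class and function names (SimulatedArm, insert_with_recovery), the force thresholds, and the simulated sensor are illustrative stand-ins for a real robot and sensor API, and the spiral-search recovery is a common generic insertion strategy, not necessarily SARAFun's actual method.

```python
# Hypothetical sketch: a force-guided insertion skill with exception
# handling, in the spirit of the assembly programs described above.
# All names and thresholds are illustrative, not a real robot API.
import math
import random

CONTACT_FORCE_N = 5.0      # force level indicating part contact
MAX_INSERT_FORCE_N = 20.0  # abort threshold: likely jam or misalignment
STEP_MM = 0.5              # incremental downward motion per cycle

class SimulatedArm:
    """Stand-in for one arm of a bi-manual robot such as YuMi."""
    def __init__(self):
        self.z_mm = 10.0   # height of the part above the hole
    def move_down(self, step_mm):
        self.z_mm -= step_mm
    def nudge_xy(self, dx_mm, dy_mm):
        pass  # lateral correction; a real driver would move the tool
    def read_insertion_force(self):
        # Fake force signal: free motion above contact, then either
        # light contact forces or an occasional jam spike.
        if self.z_mm > 2.0:
            return 0.0
        return random.choice([6.0, 8.0, 8.0, 25.0])

def insert_with_recovery(arm, max_attempts=3):
    """Guarded downward move; on excessive force, back off and retry
    from a small spiral offset (the exception-handling branch)."""
    for attempt in range(max_attempts):
        while arm.z_mm > 0.0:
            if arm.read_insertion_force() > MAX_INSERT_FORCE_N:
                break  # jammed: leave the guarded move and recover
            arm.move_down(STEP_MM)
        else:
            return True  # reached target depth without jamming
        arm.move_down(-2.0)  # back off 2 mm
        angle = 2.0 * math.pi * attempt / max_attempts
        arm.nudge_xy(0.3 * math.cos(angle), 0.3 * math.sin(angle))
    return False

if __name__ == "__main__":
    ok = insert_with_recovery(SimulatedArm())
    print("insertion succeeded" if ok else "insertion failed")
```

The design point the sketch illustrates is that the nominal motion and its recovery behaviour are separate, parameterized pieces, which is what would allow such a program to be generated automatically from a demonstration rather than hand-coded.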

Keywords: Assembly; Assembly of two tasks; Automated 3D finger generation; Automatic finger generation; Bi-manual robot; Customized grasping; H2020; Human robot interaction; Industrial; Industry; Knowledge integration; Learning; Learning, development and adaptation; Learning by demonstration; Learning by doing; Robotic Assembly; Robotic control; Robotic sensors; Robotics; Robotics for manufacturing; Teaching by demonstration; Vision-based and tactile sensing for learning object manipulation

Document Type: Research Article

Publication date: 01 June 2017

About this publication
  • Impact is a series of high-quality, open-access, free-to-access science reports designed to enable the dissemination of research impact to key stakeholders, communicating the impact and relevance of research projects across a large number of subjects in a format that is easily accessible to an academic and stakeholder audience. The publication features content from the world's leading research councils, policy groups, universities, and research projects. Impact is published under a CC-BY Creative Commons licence.
