ITEM METADATA RECORD
Title: Learning the peg-into-hole assembly operation with a connectionist reinforcement technique
Authors: Nuttin, Marnix
Van Brussel, Hendrik #
Issue Date: 30-Aug-1997
Publisher: Elsevier Science BV
Series Title: Computers in Industry, vol. 33, issue 1, pp. 101-109
Abstract: The paper presents a learning controller that is capable of increasing insertion speed during consecutive peg-into-hole operations without increasing the contact force level. This work shows that the quality of an assembly operation can be improved using a learning approach. Our aim is to find a better relationship between the measured forces and the controlled velocity. This relationship is improved without using a complicated, human-generated model. Two learning phases are distinguished: first, the learning controller is trained (or initialised) in a supervised way by a suboptimal controller; then a reinforcement learning phase follows. We followed a connectionist approach. The controller consists of two neural networks: (i) the policy network and (ii) the exploration network. On-line robotic exploration plays a crucial role in obtaining a better policy. Optionally, this architecture can be extended with a third network: the reinforcement network. The learning controller is implemented on a CAD-based contact force simulator. In contrast with most other related work, the learning controller works in three dimensions with six degrees of freedom. Performance of a peg-into-hole task is measured in insertion time and in average or maximum force level. The fact that better performance can be obtained in this way demonstrates the importance of model-free learning techniques for repetitive robotic assembly tasks. The paper presents the approach and simulation results.
ISSN: 0166-3615
Publication status: published
KU Leuven publication type: IT
Appears in Collections: Production Engineering, Machine Design and Automation (PMA) Section
# (joint) last author
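
The abstract above outlines a two-network connectionist controller: a policy network maps the measured contact forces to a commanded velocity, an exploration network shapes the on-line exploration around that command, the policy is first initialised in a supervised way from a suboptimal controller, and a reinforcement phase then improves it. The Python sketch below shows one way such an architecture can be wired together. It is not the authors' implementation: the network sizes, learning rates, the admittance-style teacher law, the reward handling, and the REINFORCE-style update rule are all assumptions introduced for illustration.

# Hedged sketch (not the paper's code) of a two-network force-to-velocity
# learning controller: a policy net maps a 6-D force/torque reading to a
# 6-DOF velocity command, and an exploration net sets how much stochastic
# perturbation to add around it. All sizes and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

class TwoLayerNet:
    """Small tanh MLP with manual backprop, standing in for the connectionist nets."""
    def __init__(self, n_in, n_hidden, n_out, lr=1e-2):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden)); self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out)); self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.x = x
        self.h = np.tanh(x @ self.W1 + self.b1)
        return self.h @ self.W2 + self.b2

    def backward(self, grad_out):
        # One gradient-descent step on the most recent forward pass,
        # given dLoss/dOutput.
        gW2 = np.outer(self.h, grad_out)
        gh = (grad_out @ self.W2.T) * (1.0 - self.h ** 2)
        gW1 = np.outer(self.x, gh)
        self.W2 -= self.lr * gW2; self.b2 -= self.lr * grad_out
        self.W1 -= self.lr * gW1; self.b1 -= self.lr * gh

policy = TwoLayerNet(6, 16, 6)       # (i) policy net: force/torque -> velocity command
exploration = TwoLayerNet(6, 16, 6)  # (ii) exploration net: force/torque -> log std of noise

def act(f):
    """Stochastic velocity command: policy mean plus exploration-scaled noise."""
    mean = policy.forward(f)
    std = np.exp(exploration.forward(f))
    noise = rng.standard_normal(6)
    return mean + std * noise, mean, std, noise

def teacher(f):
    # Hypothetical suboptimal teacher (simple admittance-style law), used only
    # to bootstrap the policy network in the supervised phase.
    return -0.05 * f

def supervised_init(force_samples):
    # Phase 1: regress the policy network onto the teacher's commands.
    for f in force_samples:
        err = policy.forward(f) - teacher(f)
        policy.backward(2.0 * err)  # gradient of squared error

def reinforcement_update(transitions, episode_return, baseline):
    # Phase 2: after an insertion episode, reinforce the perturbations that led
    # to a better-than-baseline return (the return here is assumed to trade
    # insertion time against contact force level).
    advantage = episode_return - baseline
    for f, mean, std, noise in transitions:
        policy.forward(f)
        policy.backward(-advantage * noise / std)               # d log-prob / d mean, ascent
        exploration.forward(f)
        exploration.backward(-advantage * (noise ** 2 - 1.0))   # d log-prob / d log-std, ascent

The exploration network in this sketch outputs a per-axis log standard deviation, so exploration can widen on axes where the policy is still poor and shrink where it is already reliable, which matches the role the abstract gives to on-line robotic exploration. The optional third (reinforcement) network mentioned in the abstract would presumably serve as a learned predictor of the return in place of the fixed baseline used here.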
