Task Learning and Execution for Behavior-Based Mobile Manipulation (Taken leren en uitvoeren voor gedragsgebaseerde mobiele manipulatie)
Author:
Abstract:
Automation and robotics technology is no longer confined to laboratories and factories; it is increasingly entering our daily lives. Robots are expected to perform useful services for humans while navigating in domestic environments. However, the requirements posed to robots operating in such scenarios are quite different from those for robots in traditional industrial settings. This calls for a different design concept that allows robots to perform a variety of tasks in such unstructured environments. Three stages in robot application design can be distinguished: a system development stage, an application specification stage, and an autonomous execution stage. Each stage poses different requirements to its stakeholders and calls for different solutions. In this thesis, a behavior-based design concept is applied to the control of a mobile manipulator in order to achieve better robot control in dynamically changing environments. Behaviors are considered the basic building blocks from which more complex tasks are constructed by human demonstration.

In the past, the behavior-based concept has proved suitable for mobile robots operating in dynamically changing environments. However, applying this design concept to complex applications, such as mobile manipulators, remains a major challenge. In contrast to mobile robots, manipulators are usually required to operate in direct contact with the environment and therefore need higher control bandwidth and real-time control. Moreover, directly applying the behavior-based concept to a mobile manipulator fails to account for redundancy resolution of the robotic system. Therefore, this thesis uses a constraint representation, which allows a flexible specification of control tasks, as a unified interface for programming individual behaviors. In this framework, behaviors generate constraint parameters instead of actuator set points.
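The idea of behaviors emitting constraint parameters rather than actuator set points can be illustrated with a minimal sketch. All names, coordinates, and gains below are invented for illustration and do not reflect the thesis's actual software interface:

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    coordinate: str   # controlled coordinate, e.g. a base velocity
    desired: float    # desired value for that coordinate
    weight: float     # importance of satisfying this constraint

class MoveToGoal:
    """Drives the base toward a goal position; ignores obstacles."""
    def __init__(self, goal_x):
        self.goal_x = goal_x
    def update(self, x):
        # proportional attraction toward the goal (gain chosen arbitrarily)
        return [Constraint("base_vx", 0.5 * (self.goal_x - x), weight=1.0)]

class AvoidObstacle:
    """Pushes the base away from a nearby obstacle."""
    def __init__(self, obstacle_x):
        self.obstacle_x = obstacle_x
    def update(self, x):
        d = x - self.obstacle_x
        # repulsion weight grows as the obstacle gets closer
        w = max(0.0, 2.0 - abs(d))
        return [Constraint("base_vx", 1.0 if d >= 0 else -1.0, weight=w)]

goal = MoveToGoal(goal_x=4.0)
avoid = AvoidObstacle(obstacle_x=1.0)
print(goal.update(0.0))   # one attractive constraint on base_vx
print(avoid.update(0.5))  # one repulsive constraint on base_vx
```

Note that neither behavior commands the actuators directly: each only states what it wants, as a weighted constraint, leaving arbitration to a downstream fusion component.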
A constraint-based behavior fusion mechanism (CBFM) produces the final output command as the solution of an optimization problem. This fusion mechanism allows each behavior to concentrate on achieving its own objective without having to coordinate with other behaviors. The new approach combines the behavior-based concept with the constraint-based motion control concept used to program manipulators in a fundamental way, while keeping reusability in mind. Moreover, optimization agents can be added to optimize performance aspects that are not directly related to the task itself. The proposed fusion mechanism is evaluated and demonstrated in behavior-based obstacle avoidance and door-opening experiments.

Beyond simply executing robot programs, laymen should be able to easily alter a robot's task; this allows them to deal with the large variety of tasks that arise in everyday life. In the behavior-based design concept, behaviors are defined as distributed representations of small sense-act units that can serve as basic building blocks for composing more complex tasks. A complete task in a behavior-based system can be specified by a behavior diagram: a sequence of behaviors together with transition conditions that determine when which behaviors should be activated. Task execution then boils down to executing a predefined behavior sequence, and task learning is the process of recognizing this sequence and its transition conditions. This work shows how the proposed fusion component (CBFM) reduces the complexity of a behavior diagram. As a result, task learning in behavior-based systems for complex mobile manipulation tasks is shown to be feasible and is formalized as a behavior recognition problem.

This thesis presents a generic and systematic approach to task learning and execution for behavior-based mobile manipulation.
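The fusion-as-optimization idea admits a very small sketch. Assuming, purely for illustration, that each constraint contributes a quadratic penalty w_i (u - c_i)^2 on a scalar command u, the fused command minimizing the sum of penalties has the closed form u* = (Σ w_i c_i) / (Σ w_i), i.e. the weight-normalized average of the desired values. The real CBFM is more general; this only conveys the principle:

```python
from collections import defaultdict

def fuse(constraints):
    """Fuse (coordinate, desired, weight) triples into one command per
    coordinate by solving min_u sum_i w_i * (u - c_i)^2, whose minimizer
    is the weighted mean of the desired values."""
    num = defaultdict(float)  # accumulates w_i * c_i per coordinate
    den = defaultdict(float)  # accumulates w_i per coordinate
    for coord, desired, weight in constraints:
        num[coord] += weight * desired
        den[coord] += weight
    return {c: num[c] / den[c] for c in num if den[c] > 0}

# Two behaviors contribute conflicting velocity constraints on the base:
cmd = fuse([("base_vx", 2.0, 1.0),     # goal-seeking behavior
            ("base_vx", -1.0, 3.0)])   # obstacle avoidance, weighted higher
print(cmd)  # {'base_vx': -0.25}
```

Because the arbitration happens entirely inside the optimization, a new behavior can be added by simply contributing extra constraints; no existing behavior needs to know about it.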
The proposed behavior-based task learning approach combines feature extraction, feature mapping, and machine learning techniques. Various features are defined to map sensory data into a feature space and to efficiently extract significant, meaningful information from it. Behaviors are recognized by static classifiers, e.g., a support vector machine (SVM), operating in this feature space. To validate the approach, a set of representative tasks was taught by human demonstration and then executed by the mobile manipulator in an unstructured environment. These successful experiments validate the proposed approaches and close the loop of behavior-based task learning and execution.
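The recognition step can be sketched as classification in feature space. The thesis uses an SVM; to keep this sketch dependency-free, a nearest-centroid classifier stands in for it, and the behavior names and feature values below are invented for illustration:

```python
import math

def centroid(samples):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n
                 for i in range(len(samples[0])))

class BehaviorRecognizer:
    """Maps a feature vector to the demonstrated behavior whose training
    samples lie closest in feature space (stand-in for an SVM)."""
    def __init__(self):
        self.centroids = {}

    def fit(self, labeled_features):
        # labeled_features: {behavior_name: [feature_vector, ...]}
        for name, samples in labeled_features.items():
            self.centroids[name] = centroid(samples)

    def predict(self, f):
        return min(self.centroids,
                   key=lambda name: math.dist(f, self.centroids[name]))

# Hypothetical features extracted from a demonstration, e.g.
# (gripper distance to door handle, base forward velocity):
demo = {
    "approach_door": [(1.2, 0.4), (1.0, 0.5)],
    "grasp_handle":  [(0.05, 0.0), (0.08, 0.01)],
}
rec = BehaviorRecognizer()
rec.fit(demo)
print(rec.predict((0.07, 0.0)))  # grasp_handle
```

Running the classifier over the whole demonstration stream yields a behavior label per time step; the sequence of labels and the points where they change correspond to the behavior sequence and transition conditions of the learned behavior diagram.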