Hierarchical Task-Parameterized Learning from Demonstration for Collaborative Object Movement

Joint Authors

Hu, Siyao
Kuchenbecker, Katherine J.

Source

Applied Bionics and Biomechanics

Issue

Vol. 2019, Issue 2019 (31 Dec. 2019), pp. 1-25, 25 p.

Publisher

Hindawi Publishing Corporation

Publication Date

2019-12-02

Country of Publication

Egypt

No. of Pages

25

Main Subjects

Biology

Abstract EN

Learning from demonstration (LfD) enables a robot to emulate natural human movement instead of merely executing preprogrammed behaviors. This article presents a hierarchical LfD structure of task-parameterized models for object movement tasks, which are ubiquitous in everyday life and could benefit from robotic support. Our approach uses the task-parameterized Gaussian mixture model (TP-GMM) algorithm to encode sets of demonstrations in separate models that each correspond to a different task situation. The robot then maximizes its expected performance in a new situation by either selecting a good existing model or requesting new demonstrations. Compared to a standard implementation that encodes all demonstrations together for all test situations, the proposed approach offers four advantages. First, a simply defined distance function can be used to estimate test performance by calculating the similarity between a test situation and the existing models. Second, the proposed approach can improve generalization, e.g., better satisfying the demonstrated task constraints and speeding up task execution. Third, because the hierarchical structure encodes each demonstrated situation individually, a wider range of task situations can be modeled in the same framework without deteriorating performance. Last, adding or removing demonstrations incurs low computational load, and thus the robot's skill library can be built incrementally. We first instantiate the proposed approach in a simulated task to validate these advantages. We then show that the advantages transfer to real hardware for a task where naive participants collaborated with a Willow Garage PR2 robot to move a handheld object. For most tested scenarios, our hierarchical method achieved significantly better task performance and subjective ratings than both a passive model with only gravity compensation and a single TP-GMM encoding all demonstrations.
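
The following minimal Python sketch illustrates the model-selection idea summarized in the abstract: each demonstrated situation keeps its own task-parameterized model, and a simple distance over the task parameters decides whether an existing model is reused or new demonstrations are requested. It is not the authors' implementation; the names (SituationModel, situation_distance, select_or_request), the Euclidean frame-origin distance, and the 0.15 threshold are illustrative assumptions, and the fitted TP-GMM is represented only by a placeholder field.

from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class SituationModel:
    """One entry in the robot's skill library: the task parameters of a
    demonstrated situation plus the model fitted to its demonstrations."""
    frame_origins: np.ndarray   # (n_frames, 3) origins of the task frames, e.g., object start and goal
    tp_gmm: object = None       # fitted TP-GMM for this situation (opaque placeholder here)

def situation_distance(test_frames: np.ndarray, model: SituationModel) -> float:
    """Placeholder similarity measure: mean Euclidean distance between
    corresponding task-frame origins of the test situation and a stored model."""
    return float(np.mean(np.linalg.norm(test_frames - model.frame_origins, axis=1)))

def select_or_request(test_frames: np.ndarray,
                      library: List[SituationModel],
                      threshold: float = 0.15) -> Optional[SituationModel]:
    """Return the most similar stored model if it is close enough to the test
    situation; return None to signal that new demonstrations should be requested."""
    if not library:
        return None
    distances = [situation_distance(test_frames, m) for m in library]
    best_index = int(np.argmin(distances))
    return library[best_index] if distances[best_index] <= threshold else None

if __name__ == "__main__":
    # Skill library with two previously demonstrated situations.
    library = [
        SituationModel(frame_origins=np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.3]])),
        SituationModel(frame_origins=np.array([[0.0, 0.2, 0.0], [0.4, 0.4, 0.3]])),
    ]
    # A new test situation close to the first demonstrated one.
    test = np.array([[0.02, 0.0, 0.0], [0.52, 0.01, 0.3]])
    chosen = select_or_request(test, library)
    if chosen is None:
        # The "request new demonstrations" branch: collect demonstrations,
        # fit a new TP-GMM, and append it to the library (incremental growth).
        print("No sufficiently similar model; request new demonstrations.")
    else:
        print("Reusing the closest existing situation model.")

Because each situation is stored as a separate library entry, adding or removing demonstrations touches only one element, which mirrors the incremental skill-library advantage described in the abstract.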

American Psychological Association (APA)

Hu, S., & Kuchenbecker, K. J. (2019). Hierarchical task-parameterized learning from demonstration for collaborative object movement. Applied Bionics and Biomechanics, 2019, 1-25.
https://search.emarefa.net/detail/BIM-1114744

Modern Language Association (MLA)

Hu, Siyao, and Katherine J. Kuchenbecker. "Hierarchical Task-Parameterized Learning from Demonstration for Collaborative Object Movement." Applied Bionics and Biomechanics, vol. 2019, 2019, pp. 1-25.
https://search.emarefa.net/detail/BIM-1114744

American Medical Association (AMA)

Hu S, Kuchenbecker KJ. Hierarchical task-parameterized learning from demonstration for collaborative object movement. Applied Bionics and Biomechanics. 2019;2019:1-25.
https://search.emarefa.net/detail/BIM-1114744

Data Type

Journal Articles

Language

English

Notes

Includes bibliographical references

Record ID

BIM-1114744