TY - GEN
T1 - Ergodic imitation
T2 - 2021 IEEE International Conference on Robotics and Automation, ICRA 2021
AU - Kalinowska, Aleksandra
AU - Prabhakar, Ahalya
AU - Fitzsimons, Kathleen
AU - Murphey, Todd
N1 - Funding Information:
*Authors contributed equally. This material is based upon work supported by the NSF under Grant CNS 1837515. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the aforementioned institutions. 1Mechanical Engineering, Northwestern University, Evanston, IL; 2Physical Therapy and Human Movement Sciences, Northwestern University, Chicago, IL
Publisher Copyright:
© 2021 IEEE
PY - 2021
Y1 - 2021
N2 - With growing access to versatile robotics, it is beneficial for end users to be able to teach robots tasks without needing to code a control policy. One possibility is to teach the robot through successful task executions. However, near-optimal demonstrations of a task can be difficult to provide, and even successful demonstrations can fail to capture task aspects key to robust skill replication. Here, we propose a learning from demonstration (LfD) approach that enables learning of robust task definitions without the need for near-optimal demonstrations. We present a novel algorithmic framework for learning tasks based on the ergodic metric, a measure of information content in motion. Moreover, we make use of negative demonstrations, demonstrations of what not to do, and show that they can help compensate for imperfect demonstrations, reduce the number of demonstrations needed, and highlight crucial task elements, improving robot performance. In a proof-of-concept example of cart-pole inversion, we show that negative demonstrations alone can be sufficient to successfully learn and recreate a skill. Through a human subject study with 24 participants, we show that consistently more information about a task can be captured from combined positive and negative (posneg) demonstrations than from the same amount of just positive demonstrations. Finally, we demonstrate our learning approach on simulated tasks of target reaching and table cleaning with a 7-DoF Franka arm. Our results point towards a future with robust, data-efficient LfD for novice users.
AB - With growing access to versatile robotics, it is beneficial for end users to be able to teach robots tasks without needing to code a control policy. One possibility is to teach the robot through successful task executions. However, near-optimal demonstrations of a task can be difficult to provide, and even successful demonstrations can fail to capture task aspects key to robust skill replication. Here, we propose a learning from demonstration (LfD) approach that enables learning of robust task definitions without the need for near-optimal demonstrations. We present a novel algorithmic framework for learning tasks based on the ergodic metric, a measure of information content in motion. Moreover, we make use of negative demonstrations, demonstrations of what not to do, and show that they can help compensate for imperfect demonstrations, reduce the number of demonstrations needed, and highlight crucial task elements, improving robot performance. In a proof-of-concept example of cart-pole inversion, we show that negative demonstrations alone can be sufficient to successfully learn and recreate a skill. Through a human subject study with 24 participants, we show that consistently more information about a task can be captured from combined positive and negative (posneg) demonstrations than from the same amount of just positive demonstrations. Finally, we demonstrate our learning approach on simulated tasks of target reaching and table cleaning with a 7-DoF Franka arm. Our results point towards a future with robust, data-efficient LfD for novice users.
UR - http://www.scopus.com/inward/record.url?scp=85124795137&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124795137&partnerID=8YFLogxK
U2 - 10.1109/ICRA48506.2021.9561746
DO - 10.1109/ICRA48506.2021.9561746
M3 - Conference contribution
AN - SCOPUS:85124795137
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 6293
EP - 6299
BT - 2021 IEEE International Conference on Robotics and Automation, ICRA 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 30 May 2021 through 5 June 2021
ER -