TY - GEN
T1 - Bioinspired Dynamic Affect-Based Motion Control of a Humanoid Robot to Collaborate with Human in Manufacturing
AU - Rahman, S. M.Mizanoor
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/6
Y1 - 2019/6
N2 - Affect-based intelligent motion control for human-robot collaborative assembly in manufacturing was developed, and the effects of the dynamic affect-based control on human-robot collaboration (HRC) and assembly performance were investigated. An anthropomorphic robot with affect display ability was used to collaborate with a human in an assembly task in which the human and the robot collaboratively assembled three parts. First, to derive bioinspiration, the affective features of a human-human collaborative assembly task were studied. Second, based on these human affective features, an affect-based intelligent motion control strategy for the robot was proposed so that the robot could dynamically adjust its affective states, as humans do, in response to changes in task situations during the human-robot collaborative assembly. The proposed affect-based motion control was experimentally evaluated for HRC and assembly performance, and the results were compared with those obtained when the robot collaborated with its human counterpart with no affect display and with a static affect display. The results showed that the static affect display produced better HRC and assembly performance than no affect display, while the dynamic affect display produced significantly better HRC and assembly performance than both the static and the no-affect displays. The results encourage employing anthropomorphic robots with dynamic affect-based motion control strategies to collaborate with humans in manufacturing to improve HRC and manufacturing performance.
UR - http://www.scopus.com/inward/record.url?scp=85077788404&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077788404&partnerID=8YFLogxK
U2 - 10.1109/HSI47298.2019.8942609
DO - 10.1109/HSI47298.2019.8942609
M3 - Conference contribution
AN - SCOPUS:85077788404
T3 - International Conference on Human System Interaction, HSI
SP - 76
EP - 81
BT - Proceedings - 2019 12th International Conference on Human System Interaction, HSI 2019
PB - IEEE Computer Society
T2 - 12th International Conference on Human System Interaction, HSI 2019
Y2 - 25 June 2019 through 26 June 2019
ER -