TY - GEN
T1 - From mining affective states to mining facial keypoint data
T2 - ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE 2017
AU - Lopez, Christian E.
AU - Tucker, Conrad S.
N1 - Publisher Copyright:
© 2017 ASME.
PY - 2017
Y1 - 2017
N2 - Personalized and timely feedback has the potential to improve an individual's performance on a wide variety of engineering tasks. The ability to capture an individual's affective state(s) and performance on a task is a key component needed to advance personalization of feedback. While automated methods exist for quantifying task performance, the ability to quantify an individual's affective state(s) remains an open research area. Existing methods for quantifying an individual's affective state(s) are challenging to implement where real-time assessment is needed (e.g., engineering workshop environments). This has sparked growing interest in automated systems capable of inferring individuals' affective state(s) based on their projected facial or body cues. However, existing methods attempt to employ a general model to label an individual's affective state(s) into discrete categories, such as fear, joy, and surprise. Nonetheless, emotional expressions are far more complex, and individual differences in facial expressions may degrade the performance of these systems in providing personalized feedback. To overcome these limitations, this work proposes a machine learning method for predicting an individual's performance on a task by utilizing his/her unique facial keypoint data, thereby bypassing the need to infer his/her discrete affective states. A case study involving 31 participants is presented. The support vector machine model employed to predict an individual's performance yielded an accuracy of 77.15% for an individual-task-specific model. In contrast, a general model yielded an accuracy of only 52.69%, supporting the authors' argument that individual-task-specific models are more suitable for advancing personalized feedback.
AB - Personalized and timely feedback has the potential to improve an individual's performance on a wide variety of engineering tasks. The ability to capture an individual's affective state(s) and performance on a task is a key component needed to advance personalization of feedback. While automated methods exist for quantifying task performance, the ability to quantify an individual's affective state(s) remains an open research area. Existing methods for quantifying an individual's affective state(s) are challenging to implement where real-time assessment is needed (e.g., engineering workshop environments). This has sparked growing interest in automated systems capable of inferring individuals' affective state(s) based on their projected facial or body cues. However, existing methods attempt to employ a general model to label an individual's affective state(s) into discrete categories, such as fear, joy, and surprise. Nonetheless, emotional expressions are far more complex, and individual differences in facial expressions may degrade the performance of these systems in providing personalized feedback. To overcome these limitations, this work proposes a machine learning method for predicting an individual's performance on a task by utilizing his/her unique facial keypoint data, thereby bypassing the need to infer his/her discrete affective states. A case study involving 31 participants is presented. The support vector machine model employed to predict an individual's performance yielded an accuracy of 77.15% for an individual-task-specific model. In contrast, a general model yielded an accuracy of only 52.69%, supporting the authors' argument that individual-task-specific models are more suitable for advancing personalized feedback.
UR - http://www.scopus.com/inward/record.url?scp=85034768091&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85034768091&partnerID=8YFLogxK
U2 - 10.1115/DETC2017-67340
DO - 10.1115/DETC2017-67340
M3 - Conference contribution
AN - SCOPUS:85034768091
T3 - Proceedings of the ASME Design Engineering Technical Conference
BT - 37th Computers and Information in Engineering Conference
PB - American Society of Mechanical Engineers (ASME)
Y2 - 6 August 2017 through 9 August 2017
ER -