TY - GEN
T1 - Toward Smart Internet of Things (IoT) Devices
T2 - 2020 IEEE Canadian Conference on Electrical and Computer Engineering, CCECE 2020
AU - Abdallah, Abdallah S.
AU - Elliott, Lisa J.
AU - Donley, Daniel
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/8/30
Y1 - 2020/8/30
N2 - A significant portion of Internet of Things (IoT) devices will become reliable products in our daily lives if and only if they are equipped with strong human-computer interaction (HCI) technologies, specifically visual interaction with users through affective computing. One of the major challenges in affective computing is recognizing facial expressions and the true emotions behind them. Despite numerous studies, current detection systems cannot identify facial expressions with reliable accuracy, especially in the case of negative expressions. Several research projects have attempted to extract the recognition process humans follow when identifying facial expressions in order to replicate it in smart machines, without significant success. This paper describes our interdisciplinary project, whose goal is to extract and define the recognition process that humans follow when identifying the facial expressions of others. We monitor this process by identifying and analyzing the regions of interest that participants look at when shown static emotion samples under a specific experimental setup. This paper reports the current status of data collection, the experimental setup, and initial data visualization.
AB - A significant portion of Internet of Things (IoT) devices will become reliable products in our daily lives if and only if they are equipped with strong human-computer interaction (HCI) technologies, specifically visual interaction with users through affective computing. One of the major challenges in affective computing is recognizing facial expressions and the true emotions behind them. Despite numerous studies, current detection systems cannot identify facial expressions with reliable accuracy, especially in the case of negative expressions. Several research projects have attempted to extract the recognition process humans follow when identifying facial expressions in order to replicate it in smart machines, without significant success. This paper describes our interdisciplinary project, whose goal is to extract and define the recognition process that humans follow when identifying the facial expressions of others. We monitor this process by identifying and analyzing the regions of interest that participants look at when shown static emotion samples under a specific experimental setup. This paper reports the current status of data collection, the experimental setup, and initial data visualization.
UR - http://www.scopus.com/inward/record.url?scp=85097819091&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85097819091&partnerID=8YFLogxK
U2 - 10.1109/CCECE47787.2020.9255696
DO - 10.1109/CCECE47787.2020.9255696
M3 - Conference contribution
AN - SCOPUS:85097819091
T3 - Canadian Conference on Electrical and Computer Engineering
BT - 2020 IEEE Canadian Conference on Electrical and Computer Engineering, CCECE 2020
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 30 August 2020 through 2 September 2020
ER -