Project Details
Description
The project goal is to unlock the wealth of information about human expression already found in videos on the internet. The multidisciplinary project team will collect videos of human movement available online and use both experts in movement analysis and non-experts to pinpoint characteristics of human movement that can drive algorithms for classifying the emotion expressed by the mover. These characteristics will form labels on the data covering context, demographics, technical concepts from movement analysis, and emotion. This work takes an unprecedented, multidisciplinary approach to creating a data infrastructure for computational modeling of bodily expression of emotion. To ensure the infrastructure's compatibility with human-robot interaction research, the team will conduct a public-facing feasibility study. The team will also employ advisory boards and continue to engage active researchers across multiple sub-disciplines of the computer and information science and engineering research community in designing, creating, testing, and disseminating the data infrastructure, and will organize annual user community workshops and benchmarking challenges. The data infrastructure is expected to promote technological innovations and breakthroughs in data-driven modeling of human bodily expression of emotion and affect, a highly complex problem with applications in healthcare (e.g., caregiving robots and diagnostic tools for mental health), manufacturing (e.g., socially aware autonomous forklifts and safety monitoring systems), security (e.g., monitoring), and consumer electronics (e.g., improved interactions with a home robot).

Bodily movement expresses important information, including emotion, which is crucial for future human-machine interactions. As in other areas of artificial intelligence (AI), such as image recognition, a large-scale, data-driven approach holds promise for revealing new insights into the complex, subtle, and contextual nature of human bodily expression. However, research on computational recognition of bodily expression, an area of affective computing, AI, and human-robot interaction, is struggling to mature because researchers must replicate many of the same work-intensive steps, creating divergent efforts and expense. This NSF project aims to create a large-scale, high-quality, multifaceted, annotated, open, and extensible data infrastructure for computational understanding of human bodily expression in a variety of settings. It will leverage the team's expertise in AI, computer vision, affective computing, expressive robotics, emotion recognition, psychology, movement analysis, statistics and data mining, data ethics, and the arts to create (1) a data-sharing infrastructure tailored to the needs of research into subjective experience, emotion, and bodily movement; (2) a crowdsourced, annotated video dataset; and (3) a collection of tools and software for rigorous reliability validation, reproducibility and transparency assessment, and content-based search and retrieval. The data infrastructure is expected to serve applications in fields such as robotics, psychology, performing arts, animation, and entertainment. The project also develops human expertise in this emerging field by supporting graduate and undergraduate students, including students from underrepresented groups, giving them experience in infrastructure development and in integrating knowledge from multiple disciplines. These students will interact regularly with the team's international partners. Public events will create broad public engagement with the work, focusing on applications to human-robot interaction. The infrastructure will stimulate focused research projects and agendas in affective computing, AI (including artificial emotional intelligence and human-AI interaction), computer vision, social/assistive robotics, virtual agents, psychiatric telemedicine, human-centered design, machine/deep learning, ethics in computing, and related communities.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
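The description above says each video will carry labels spanning context, demographics, movement-analysis concepts, and emotion. As an illustration only, here is a minimal sketch of what one annotated clip record could look like; the `AnnotationRecord` structure and all field names (e.g., `clip_url`, `movement_labels`) are hypothetical assumptions, not the project's actual schema.

```python
from dataclasses import dataclass

# Hypothetical annotation record for one video clip.
# Field names and label categories are illustrative assumptions
# based on the award description, not the project's real schema.

@dataclass
class AnnotationRecord:
    clip_url: str                 # source video found online
    start_s: float                # clip start time (seconds)
    end_s: float                  # clip end time (seconds)
    context: str                  # setting, e.g., "solo dance performance"
    demographics: dict[str, str]  # e.g., {"age_range": "25-34"}
    movement_labels: list[str]    # technical movement-analysis concepts
    emotion: str                  # perceived emotion label
    annotator_role: str           # "expert" or "crowd" (non-expert)

record = AnnotationRecord(
    clip_url="https://example.com/video123",  # placeholder URL
    start_s=12.0,
    end_s=18.5,
    context="solo dance performance",
    demographics={"age_range": "25-34"},
    movement_labels=["sudden", "strong", "direct"],  # illustrative terms
    emotion="anger",
    annotator_role="expert",
)
print(record.emotion)
```

Keeping expert and non-expert annotations in the same record type, distinguished only by `annotator_role`, is one plausible way to support the reliability-validation tooling the description mentions; the actual infrastructure may organize labels quite differently.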
| Status | Active |
| --- | --- |
| Effective start/end date | 4/15/23 → 3/31/26 |
Funding
- National Science Foundation: $1,832,335.00