Collaborative Research: Leveraging Crowd-AI Teams for Scalable Novelty Ratings of Heterogeneous Design Representations


Project Details


This research project will advance the field of engineering design by developing a framework for rating the novelty and quality of design artifacts using a combination of human and machine expertise. The project will address a significant challenge in the field: how to accurately evaluate the vast amounts of complex, heterogeneous ideas generated in design studies. The project will contribute to understanding how to effectively leverage human and computational expertise in evaluating design artifacts, leading to improved product design. While focused on an engineering design use case, the project will contribute to machine learning (ML) research by providing evidence on what, when, and how to learn from humans when evaluating heterogeneous datasets. The project will establish a website with open-source data and workshops for early-career professionals to enhance the broader impacts of the work on the engineering design community at large. The research team will partner with the Women in Science and Engineering Research (WISER) program and the Multicultural Engineering Program (MEP) at Penn State to encourage participation of underrepresented groups in engineering.

The project investigates three distinct means of evaluating heterogeneous design artifacts (CAD drawings, text, sketches, prototypes) against established design metrics at scale, namely (1) human expert designers and crowd-sourced human raters; (2) pure ML-based methods; and (3) expert designer-assisted ML methods. These methods will be used to better understand the impact of design representation on the reliability of crowd-based design ratings, establish and validate multi-modal representation learning methods for design evaluation, and create methods for deploying crowd-ML collaborations for scalable design evaluation.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Effective start/end date: 7/1/23 to 6/30/26


  • National Science Foundation: $299,662.00

