Synthetic data and ELSI-focused computational checklists—A survey of biomedical professionals’ views

Jennifer K. Wagner, Laura Y. Cabrera, Sara Gerke, Daniel Susser

Research output: Contribution to journal › Article › peer-review


Abstract

Artificial intelligence (AI) and machine learning (ML) tools are now proliferating in biomedical contexts, and there is no sign this will slow down any time soon. AI/ML and related technologies promise to improve scientific understanding of health and disease and have the potential to spur the development of innovative and effective diagnostics, treatments, cures, and medical technologies. Concerns about AI/ML are prominent, but two specific aspects of AI/ML have so far received little research attention: synthetic data and computational checklists that might promote not only the reproducibility of AI/ML tools but also increased attention to the ethical, legal, and social implications (ELSI) of AI/ML tools. We administered a targeted survey to explore these two items among biomedical professionals in the United States. Our survey findings suggest that there is a gap in familiarity with both synthetic data and computational checklists among AI/ML users and developers and those in ethics-related positions who might be tasked with ensuring the proper use or oversight of AI/ML tools. The findings from this survey study underscore the need for additional ELSI research on synthetic data and computational checklists to inform escalating efforts, including the establishment of laws and policies, to ensure safe, effective, and ethical use of AI in health settings.

Original language: English (US)
Article number: e0000666
Journal: PLOS Digital Health
Volume: 3
Issue number: 11
DOIs
State: Published - Nov 2024

All Science Journal Classification (ASJC) codes

  • Health Informatics
