Data Transparency and Fairness Analysis of the NYPD Stop-and-Frisk Program

Research output: Contribution to journal › Article › peer-review



Given increased concern about racial disparities in stop-and-frisk programs, the New York Police Department (NYPD) is required to publicly release detailed data on every stop conducted by police officers, including the suspected offense and the race of the suspect. This public data transparency policy makes it possible to investigate racial bias in stop-and-frisk data and demonstrates the value of transparency for confirming or refuting social beliefs about police practices. Data transparency is thus a crucial need in the era of Artificial Intelligence (AI), where police and justice systems increasingly use AI techniques not only to understand police practices but also to predict recidivism, crime, and terrorism. In this study, we develop a predictive analytics method, including bias metrics and bias mitigation techniques, to analyze the NYPD Stop-and-Frisk datasets and discover whether underlying bias patterns are responsible for stops and arrests. In addition, we perform a fairness analysis on two protected attributes, race and gender, and investigate their impact on arrest decisions. We also apply bias mitigation techniques. The experimental results show that the NYPD Stop-and-Frisk dataset is not biased against people of color and Hispanic individuals, and thus law enforcement authorities can apply the bias-aware predictive analytics method to make fairer decisions before making any arrests.
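The abstract does not specify which bias metrics the study computes over the protected attributes. As a minimal illustrative sketch (not the paper's actual pipeline), two group-fairness metrics commonly used in such analyses are the statistical parity difference and the disparate impact ratio; the field names `race` and `arrested` below are hypothetical stand-ins for the NYPD dataset's columns, and the records are synthetic:

```python
# Hedged sketch of group-fairness metrics, not the authors' implementation.
# Computes positive-outcome rates per group and compares them.

def group_rate(records, group_field, group_value, outcome_field):
    """Fraction of records in the given group with a positive outcome."""
    group = [r for r in records if r[group_field] == group_value]
    if not group:
        return 0.0
    return sum(r[outcome_field] for r in group) / len(group)

def statistical_parity_difference(records, group_field, unpriv, priv, outcome_field):
    """P(outcome=1 | unprivileged) - P(outcome=1 | privileged); 0 means parity."""
    return (group_rate(records, group_field, unpriv, outcome_field)
            - group_rate(records, group_field, priv, outcome_field))

def disparate_impact(records, group_field, unpriv, priv, outcome_field):
    """Ratio of positive-outcome rates; values below ~0.8 are often flagged."""
    priv_rate = group_rate(records, group_field, priv, outcome_field)
    if priv_rate == 0:
        return float("inf")
    return group_rate(records, group_field, unpriv, outcome_field) / priv_rate

# Synthetic example records (not real NYPD data).
stops = [
    {"race": "A", "arrested": 1},
    {"race": "A", "arrested": 0},
    {"race": "B", "arrested": 1},
    {"race": "B", "arrested": 1},
]

print(statistical_parity_difference(stops, "race", "A", "B", "arrested"))  # -0.5
print(disparate_impact(stops, "race", "A", "B", "arrested"))               # 0.5
```

A bias mitigation step such as the reweighing technique mentioned in the fairness literature would then assign instance weights so that these metrics move toward parity before a classifier is trained.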

Original language: English (US)
Article number: 7
Journal: Journal of Data and Information Quality
Issue number: 2
State: Published - Jun 2022

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Information Systems and Management


