The Impact Remediation Framework: CDS PhD Student Lucius Bynum on Discovering Optimal Intervention Policies to Improve Equity

NYU Center for Data Science
2 min readSep 23, 2021

Automated decision systems often come under fire because they can exacerbate unfair and biased outcomes for specific individuals and groups. Most attempts to resolve this disparity share one overarching theme: defining and mitigating the unfair discrimination that an automated decision system could produce. However, CDS Ph.D. student Lucius Bynum, together with CDS Professor Julia Stoyanovich and former CDS-affiliated professor Joshua Loftus, defines another approach in the paper “Disaggregated Interventions to Reduce Inequality.”

Rather than focusing on unfairness within the model, they concentrate on remediating disparity that already exists in the world through an intervention. This method is called impact remediation. Building the impact remediation framework involves measuring disparity, defining the possible interventions, and then finding the optimal set of units on which to perform those interventions.
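The steps above can be sketched as a small optimization problem. The following is an illustrative toy, not the paper's actual formulation: given hypothetical per-unit estimates of an intervention's effect, it searches for the budget-limited set of units whose treatment most reduces a simple gap-in-means disparity between two groups. All data, function names, and the disparity measure are assumptions for illustration.

```python
from itertools import combinations

def disparity(outcomes, groups):
    """Absolute gap in mean outcome between group 0 and group 1."""
    g0 = [o for o, g in zip(outcomes, groups) if g == 0]
    g1 = [o for o, g in zip(outcomes, groups) if g == 1]
    return abs(sum(g0) / len(g0) - sum(g1) / len(g1))

def best_intervention_set(outcomes, groups, effects, budget):
    """Brute-force search over all unit subsets of size <= budget,
    returning the subset that minimizes post-intervention disparity."""
    n = len(outcomes)
    best_set, best_gap = (), disparity(outcomes, groups)
    for k in range(1, budget + 1):
        for subset in combinations(range(n), k):
            # Apply the (assumed known) intervention effect to chosen units.
            new = [o + effects[i] if i in subset else o
                   for i, o in enumerate(outcomes)]
            gap = disparity(new, groups)
            if gap < best_gap:
                best_set, best_gap = subset, gap
    return best_set, best_gap

# Toy example: group 1 starts with lower outcomes; intervening on its
# two units closes most of the gap.
outcomes = [0.8, 0.7, 0.4, 0.3]
groups   = [0, 0, 1, 1]
effects  = [0.1, 0.1, 0.3, 0.3]  # hypothetical per-unit effect estimates
print(best_intervention_set(outcomes, groups, effects, budget=2))
```

A brute-force search like this only scales to a handful of units; in practice, problems of this shape are typically posed as integer programs and handed to an optimization solver.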

The impact remediation framework can accommodate a broad range of real-world problems. Given its versatility, this research could significantly change how policy interventions are designed and implemented today.

To learn more, read their research at arxiv.org.

About the team:

Lucius Bynum is currently a Ph.D. student at CDS. His research interests center on the intersection between responsible data science and statistical theory and methodology, e.g., in areas such as machine learning fairness and causal inference. Prior to joining CDS, he worked as a researcher in the Applied Statistics and Computational Modeling group at Pacific Northwest National Laboratory. Lucius holds a B.Sc. in Data Science from Harvey Mudd College.

Julia Stoyanovich is an Associate Professor at CDS. Her research focuses on responsible data management and analysis practices: on operationalizing fairness, diversity, transparency, and data protection in all stages of the data acquisition and processing lifecycle. She established the Data, Responsibly consortium and served on the New York City Automated Decision Systems Task Force, appointed by Mayor de Blasio.

Joshua Loftus is an Assistant Professor at the London School of Economics and previously taught at NYU. His research examines common practices in machine learning and data science pipelines, addressing sources and types of error that have previously been overlooked. His peer-reviewed research has been published in the Annals of Statistics, Biometrika, Advances in Neural Information Processing Systems (NeurIPS), and the International Conference on Machine Learning (ICML).

By Keerthana Manivasakan
