CDS Spotlight: The Center for Responsible AI
At the Center for Data Science, we believe in the immense potential of AI to create positive change in the world and in our everyday lives. Indeed, many of our past and present community members are already working to create that change, whether by building systems for breast cancer detection or for predicting deterioration in hospitalized COVID-19 patients. However, many in the data science field have valid concerns that we may be building AI that ultimately deepens society’s structural inequities.
Many departments across NYU, including CDS, have worked to encourage ethical practices in AI, but until recently there was no dedicated space where these efforts could come together. Last year that changed with the formation of the Center for Responsible AI (R/AI). Led by CDS faculty member Julia Stoyanovich, R/AI aims to “be a comprehensive applied research and tool production laboratory for accelerating responsible AI practices that arise from real world collaborations.”
Though Professor Stoyanovich created R/AI, she has an unusual vision for its future: one day, she hopes, it will no longer need to exist. “The mission of R/AI is to make itself obsolete in a couple of years,” she told us recently in an interview. Her hope is that through the efforts of the Center, and of efforts taking place across the globe, “responsible AI and just plain AI become synonymous. Meaning that whenever anybody engages in the design or deployment or oversight of a system that uses artificial intelligence, they should be aware of the social, political, and legal context in which this system is used.”
But how do we make responsible AI the default? R/AI is hard at work on projects that aim to do just that, including a rich catalog of educational offerings. Among them is “Responsible Data Science,” a technical course for future practitioners offered by CDS and taught by Professor Stoyanovich. The course gives students an understanding of a wide range of issues, including ethics, legal compliance, data quality, algorithmic fairness and diversity, transparency of data and algorithms, and privacy and data protection — all of which are crucial to developing AI responsibly.
But R/AI isn’t just focused on changing the minds of people who are professionally involved in AI. It’s just as interested in educating the general public on what responsible AI means. To this end, R/AI has launched the Data, Responsibly comic book series, which aims to teach basic AI concepts using humor and imagery. Since the series’ launch last year, a second volume has been released. Titled “Fairness and Friends,” the volume covers algorithms, automated decision systems, and bias, connecting work on algorithmic fairness to concepts from political philosophy. R/AI will also be launching a course with Peer 2 Peer University and the Queens Public Library called “We Are AI” that will cover these topics and more!
Despite all of these efforts, many people may still ask why we should care so much about ethics in AI. Professor Stoyanovich puts it quite simply:
“Because AI is used ubiquitously, in domains that are beyond the scientific, these technologies impact the lives and livelihoods of all of us … When we look for things to buy on Amazon, for movies to watch, or for restaurants to order from — all of these recommendations are powered by AI. If done right, this is a good thing but unfortunately these days it’s the Wild West. Society is asking AI to do things that it’s not meant to do. We’re asking it to exercise discretion, have a sense of humor, and really behave like a person … This is why we, humans, have to bring our values and beliefs to the context of use of these systems, to exercise agency and gain control over our AI-enabled world. This is why every single one of us, professionally but also personally, has to care about the ethics of AI. Because AI is here to stay.”
More information about R/AI can be found on their website.