CDS Professor Presents Research at ICML Conference

3 min read · Jul 28, 2020

We’re excited to announce that CDS assistant professor of data science (and assistant professor of computer science and mathematics at NYU’s Courant Institute) Joan Bruna presented research at the ICML conference earlier this month alongside his students and collaborators.

Joan Bruna (courtesy of Joan Bruna)

Joan Bruna holds a Ph.D. in Applied Mathematics from Ecole Polytechnique, Palaiseau, France. He is also a co-founder of the MaD (Math and Data) group at CDS and the Courant Institute, whose mission is to advance the mathematical and statistical foundations of data science, specializing in signal processing and inverse problems, machine learning and deep learning, and high-dimensional statistics and probability.

The ICML (International Conference on Machine Learning) brings together professionals across the scientific community with the common goal of advancing machine learning research. The conference recently took place virtually on July 12–18, 2020. ICML is globally recognized for publishing and presenting research on machine learning, particularly in the areas of statistics and data science, artificial intelligence, and significant application areas such as machine vision, speech recognition, computational biology, and robotics.

The research paper presented, “Extragradient with player sampling for faster Nash equilibrium finding,” analyzes “a new extra-gradient method for Nash equilibrium finding, that performs gradient extrapolations and updates on a random subset of players at each iteration.” The team proposes “an additional variance reduction mechanism to obtain speed-ups in smooth convex games.” Their approach “makes extrapolation amenable to massive multiplayer settings, and brings empirical speed-ups, in particular when using a heuristic cyclic sampling scheme. Most importantly, it allows us to train faster and better GANs and mixtures of GANs.” — Extragradient with player sampling…

Paper’s Abstract

Data-driven modeling increasingly requires to find a Nash equilibrium in multi-player games, e.g. when training GANs. In this paper, we analyse a new extra-gradient method for Nash equilibrium finding, that performs gradient extrapolations and updates on a random subset of players at each iteration. This approach provably exhibits a better rate of convergence than full extra-gradient for non-smooth convex games with noisy gradient oracle. We propose an additional variance reduction mechanism to obtain speed-ups in smooth convex games. Our approach makes extrapolation amenable to massive multiplayer settings, and brings empirical speed-ups, in particular when using a heuristic cyclic sampling scheme. Most importantly, it allows us to train faster and better GANs and mixtures of GANs.

The research project emerged when Joan, his students Carles Domingo and Samy Jelassi, and collaborators Damien Scieur and Arthur Mensch, found themselves interested in multiagent learning and game dynamics, specifically to advance the foundations of optimization algorithms and their interplay with non-linear function approximation.

The team ultimately set out to understand how to optimally use so-called player extrapolation in multiagent optimization schemes. Such extrapolation anticipates each player’s next move and likely helps players converge to equilibria under appropriate settings, at the expense of added computational cost. Specifically, they focused on studying the robustness of mirror-prox to ‘noisy’ player extrapolation, which subsequently enables more efficient, sparse player extrapolation.
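The extrapolate-then-update scheme described above can be sketched in a few lines. The toy example below is not from the paper: the quadratic game, the coupling constant `c`, the step size `eta`, and the function names are all illustrative assumptions, chosen so that a Nash equilibrium exists at the origin. Each iteration samples a random subset of players, extrapolates only their strategies (anticipating their next move), and then applies the update at the extrapolated point.

```python
import random

def grad(i, x, c=0.3):
    """Gradient of player i's loss in a hypothetical smooth convex
    quadratic game: f_i(x) = 0.5 * x_i**2 + c * x_i * sum(x_j, j != i)."""
    return x[i] + c * (sum(x) - x[i])

def extragradient_with_sampling(x0, eta=0.1, steps=2000, k=1, seed=0):
    """Extra-gradient with random player sampling (illustrative sketch):
    at each iteration, only k randomly chosen players are extrapolated
    and updated, rather than all players as in full extra-gradient."""
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        players = rng.sample(range(n), k)   # random subset of players
        x_ext = list(x)
        for i in players:                   # extrapolation: anticipate next move
            x_ext[i] = x[i] - eta * grad(i, x)
        for i in players:                   # update at the extrapolated point
            x[i] = x[i] - eta * grad(i, x_ext)
    return x

x_star = extragradient_with_sampling([1.0, -2.0, 0.5])
print(x_star)  # strategies approach the Nash equilibrium at (0, 0, 0)
```

Setting `k = n` recovers full extra-gradient; smaller `k` trades per-iteration cost for noisier progress, which is the regime the paper studies.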

The project is an ongoing research effort in which the group will continue to study the theoretical foundations of multiagent learning. This setup presents key differences relative to classic single-agent learning, and the group will continue working to connect these differences to the broader theoretical foundations of deep learning.

To learn more about the paper, please view it in its entirety here. More information about ICML can be found on the conference website.

By Ashley C. McDonald


Written by NYU Center for Data Science

Official account of the Center for Data Science at NYU, home of the Undergraduate, Master’s, and Ph.D. programs in Data Science.