CDS Shines at NeurIPS 2023
In the world of data science, few events garner as much attention and excitement as the annual Neural Information Processing Systems (NeurIPS) conference. 2023’s event, held in New Orleans in December, was no exception, showcasing groundbreaking research from around the globe. At the heart of these innovations stood the faculty of CDS, demonstrating their pivotal role in advancing the field.
In the lead-up to the conference, we put together a comprehensive list of NeurIPS papers authored by CDS members (faculty, researchers, and PhD students). Now that the conference is over, we want to highlight our Leadership and Joint Faculty's contributions, which spanned a wide array of topics, from machine learning algorithms to neuroscience, and underscored the center's multidisciplinary, cutting-edge approach to research.
Andrew Wilson (Associate Professor of Computer Science and Data Science)
- “A Performance-Driven Benchmark for Feature Selection in Tabular Deep Learning” by Valeriia Cherepanova, Roman Levin, Gowthami Somepalli, Jonas Geiping, C. Bayan Bruss, Andrew Wilson, Tom Goldstein, Micah Goldblum (Postdoc Researcher)
- “CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra” by Andres Potapczynski (PhD student), Marc Finzi, Geoff Pleiss, Andrew Wilson
- “Battle of the Backbones: A Large-Scale Comparison of Pretrained Models across Computer Vision Tasks” by Micah Goldblum (Postdoc Researcher), Hossein Souri, Renkun Ni, Manli Shu, Viraj Prabhu, Gowthami Somepalli, Prithvijit Chattopadhyay, Mark Ibrahim, Adrien Bardes, Judy Hoffman, Rama Chellappa, Andrew Wilson, Tom Goldstein
- “Large Language Models Are Zero-Shot Time Series Forecasters” by Nate Gruver, Marc Finzi, Shikai Qiu, Andrew Wilson
- “Should We Learn Most Likely Functions or Parameters?” by Shikai Qiu, Tim G. J. Rudner (Faculty Fellow), Sanyam Kapoor (PhD student), Andrew Wilson
- “Simplifying Neural Network Training Under Class Imbalance” by Ravid Shwartz-Ziv (Faculty Fellow), Micah Goldblum (Postdoc Researcher), Yucen Li, C. Bayan Bruss, Andrew Wilson
- “Understanding the detrimental class-level effects of data augmentation” by Polina Kirichenko (PhD student), Mark Ibrahim, Randall Balestriero, Diane Bouchacourt, Shanmukha Ramakrishna Vedantam, Hamed Firooz, Andrew Wilson
- “Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution” by Ying Wang, Tim G. J. Rudner (Faculty Fellow), Andrew Wilson
Joan Bruna (Associate Professor of Computer Science and Data Science)
- “A Neural Collapse Perspective on Feature Evolution in Graph Neural Networks” by Vignesh Kothapalli, Tom Tirer, Joan Bruna
- “Inverse Dynamics Pretraining Learns Good Representations for Multitask Imitation” by David Brandfonbrener, Ofir Nachum, Joan Bruna
- “On Single-Index Models beyond Gaussian Data” by Aaron Zweig, Loucas Pillaud-Vivien, Joan Bruna
Eero Simoncelli (Professor of Neural Science, Mathematics, Data Science, and Psychology)
- “A polar prediction model for learning to represent visual transformations” by Pierre-Étienne Fiquet, Eero Simoncelli
- “Adaptive whitening with fast gain modulation and slow synaptic plasticity” by Lyndon Duong, Eero Simoncelli, Dmitri Chklovskii, David Lipshutz
- “Learning Efficient Coding of Natural Images with Maximum Manifold Capacity Representations” by Thomas Yerxa, Yilun Kuang, Eero Simoncelli, SueYeon Chung (affiliated professor)
Yann LeCun (Professor of Computer Science, Neural Science, Data Science, and Electrical and Computer Engineering)
- “An Information Theory Perspective on Variance-Invariance-Covariance Regularization” by Ravid Shwartz-Ziv (Faculty Fellow), Randall Balestriero, Kenji Kawaguchi, Tim G. J. Rudner (Faculty Fellow), Yann LeCun
- “Reverse Engineering Self-Supervised Learning” by Ido Ben-Shaul, Ravid Shwartz-Ziv (Faculty Fellow), Tomer Galanti, Shai Dekel, Yann LeCun
- “Self-Supervised Learning with Lie Symmetries for Partial Differential Equations” by Grégoire Mialon, Quentin Garrido, Hannah Lawrence, Danyal Rehman, Yann LeCun
Kyunghyun Cho (Professor of Computer Science and Data Science)
- “AbDiffuser: full-atom generation of in-vitro functioning antibodies” by Karolis Martinkus, Jan Ludwiczak, Wei-Ching Liang, Julien Lafrance-Vanasse, Isidro Hotzel, Arvind Rajpal, Yan Wu, Kyunghyun Cho, Richard Bonneau, Vladimir Gligorijevic, Andreas Loukas
- “Protein Design with Guided Discrete Diffusion” by Nate Gruver, Samuel Stanton (PhD alumnus), Nathan Frey, Tim G. J. Rudner (Faculty Fellow), Isidro Hotzel, Julien Lafrance-Vanasse, Arvind Rajpal, Kyunghyun Cho, Andrew Wilson (Associate Professor of Computer Science and Data Science)
Qi Lei (Assistant Professor of Mathematics and Data Science)
- “Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering” by Yijun Dong, Kevin Miller, Qi Lei, Rachel Ward
- “Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms” by Qian Yu, Yining Wang, Baihe Huang, Qi Lei, Jason Lee
Cristina Savin (Assistant Professor of Neural Science and Data Science)
- “Formalizing locality for normative synaptic plasticity models” by Colin Bredenberg, Ezekiel Williams, Cristina Savin, Blake Richards, Guillaume Lajoie
Rajesh Ranganath (Assistant Professor of Computer Science and Data Science)
- “Don’t blame Dataset Shift! Shortcut Learning due to Gradients and Cross Entropy” by Aahlad Manas Puli, Lily Zhang (PhD student), Yoav Wald (Faculty Fellow), Rajesh Ranganath
He He (Assistant Professor of Computer Science and Data Science)
- “Testing the General Deductive Reasoning Capacity of Large Language Models Using OOD Examples” by Abulhair Saparov, Richard Yuanzhe Pang, Vishakh Padmakumar (PhD student), Nitish Joshi, Mehran Kazemi, Najoung Kim, He He
Jonathan Niles-Weed (Assistant Professor of Mathematics and Data Science)
- “The Adversarial Consistency of Surrogate Risks for Binary Classification” by Natalie Frank, Jonathan Niles-Weed
Yanjun Han (Assistant Professor of Mathematics and Data Science)
- “Learning and Collusion in Multi-unit Auctions” by Simina Branzei, Mahsa Derakhshan, Negin Golrezaei, Yanjun Han
Samuel Bowman (Associate Professor of Linguistics and Data Science)
- “Language Models Don’t Always Say What They Think: Unfaithful Explanations in Chain-of-Thought Prompting” by Miles Turpin (Junior Research Scientist), Julian Michael (Research Scientist), Ethan Perez, Samuel Bowman
In total, CDS leadership and joint faculty contributed to an impressive array of research, showcasing not only their expertise but also their commitment to addressing some of the most challenging and significant questions in data science today.
By Stephen Thomas