NeurIPS 2022: Research paper reviews

A number of our researchers attended NeurIPS 2022 in New Orleans to keep up to date with the latest developments in machine learning. Read their thoughts on some of the best papers from the conference.

ML is a fast-evolving discipline; attending conferences like NeurIPS and staying abreast of the latest research is key to the success of our quantitative researchers and machine learning engineers.

Our NeurIPS 2022 paper review series gives you the opportunity to hear about the research and papers that our quants and ML engineers found most interesting at the conference.

Follow the links below to read each set of NeurIPS 2022 paper reviews.

Sebastian L – Quantitative Researcher

  • Focal Modulation Networks
  • Reconstructing Training Data from Trained Neural Networks

Read the NeurIPS paper review

Simon L – Senior Quantitative Researcher

  • Agreement-on-the-Line: Predicting the Performance of Neural Networks under Distribution Shift
  • Deep Ensembles Work, But Are They Necessary?

Read the NeurIPS paper review

Tom M – Machine Learning Engineer

  • On the Symmetries of Deep Learning Models and their Internal Representations
  • IBUG: Instance-Based Uncertainty Estimation for Gradient-Boosted Regression Trees

Read the NeurIPS paper review

Hugh S – Senior Quantitative Researcher

  • Posterior and Computational Uncertainty in Gaussian Processes
  • Structural Kernel Search via Bayesian Optimization and Symbolical Optimal Transport

Read the NeurIPS paper review

Leo M – Quantitative Researcher

  • Beyond neural scaling laws: beating power law scaling via data pruning
  • The curse of (non)convexity: The case of an Optimization-Inspired Data Pruning algorithm

Read the NeurIPS paper review

Maria R – Quantitative Researcher

  • Low-rank lottery tickets: finding efficient low-rank neural networks via matrix differential equations
  • AUTOMATA: Gradient Based Data Subset Selection for Compute-Efficient Hyper-parameter Tuning

Read the NeurIPS paper review

Maxime R – Senior Quantitative Researcher

  • On the Parameterization and Initialization of Diagonal State Space Models
  • Sharpness-Aware Training for Free

Read the NeurIPS paper review

Vuk R – Quantitative Researcher

  • Sheaf Attention Networks
  • The Union of Manifolds Hypothesis

Read the NeurIPS paper review