Machine learning is a fast-evolving discipline; attending conferences such as NeurIPS and keeping up to date with the latest developments is key to the success of our quantitative researchers and machine learning engineers.
Our NeurIPS 2022 paper review series gives you the opportunity to hear about the papers our quants and ML engineers found most interesting at the conference.
Follow the links below to read each set of NeurIPS 2022 paper reviews.
Sebastian L – Quantitative Researcher
- Focal Modulation Networks
- Reconstructing Training Data from Trained Neural Networks
Simon L – Senior Quantitative Researcher
- Agreement-on-the-Line: Predicting the Performance of Neural Networks under Distribution Shift
- Deep Ensembles Work, But Are They Necessary?
Tom M – Machine Learning Engineer
- On the Symmetries of Deep Learning Models and their Internal Representations
- IBUG: Instance-Based Uncertainty Estimation for Gradient-Boosted Regression Trees
Hugh S – Senior Quantitative Researcher
- Posterior and Computational Uncertainty in Gaussian Processes
- Structural Kernel Search via Bayesian Optimization and Symbolical Optimal Transport
Leo M – Quantitative Researcher
- Beyond neural scaling laws: beating power law scaling via data pruning
- The curse of (non)convexity: The case of an Optimization-Inspired Data Pruning algorithm
Maria R – Quantitative Researcher
- Low-rank lottery tickets: finding efficient low-rank neural networks via matrix differential equations
- AUTOMATA: Gradient Based Data Subset Selection for Compute-Efficient Hyper-parameter Tuning
Maxime R – Senior Quantitative Researcher
- On the Parameterization and Initialization of Diagonal State Space Models
- Sharpness-Aware Training for Free
Vuk R – Quantitative Researcher
- Sheaf Attention Networks
- The Union of Manifolds Hypothesis