ICML 2023: Research paper reviews

A number of our researchers attended the International Conference on Machine Learning (ICML) in Hawaii, US, in July. Read their thoughts on some of the best papers from the conference.

Machine Learning (ML) is a fast-evolving discipline, which means attending conferences and hearing about the very latest research are key to the ongoing development and success of our quantitative researchers and ML engineers.

As such, we encourage our researchers to attend conferences, and ICML 2023 was no different, with several members of our team descending on Hawaii for the latest instalment.

Follow the links to learn more about the papers our team found interesting, and the insights they drew from them.

Maria R – Quantitative Researcher

  • Learning to Maximize Mutual Information for Dynamic Feature Selection
  • Iterative Approximate Cross-Validation
  • “Why did the model fail?”: Attributing Model Performance Changes to Distributional Shifts
  • Scaling Up Dataset Distillation to ImageNet-1K with Constant Memory

Read the ICML 2023 paper review

Jonathan L – Quantitative Researcher

  • Resurrecting Recurrent Neural Networks for Long Sequences
  • Theoretical Guarantees of Learning Ensembling Strategies with Applications to Time Series Forecasting
  • Rockmate: an Efficient, Fast, Automatic and Generic Tool for Re-materialization in PyTorch

Read the ICML 2023 paper review

Casey H – Machine Learning Engineer

  • DetectGPT: Zero-Shot Machine-Generated Text Detection using Probability Curvature
  • Resurrecting Recurrent Neural Networks for Long Sequences
  • Git-Theta: A Git Extension for Collaborative Development of Machine Learning Models

Read the ICML 2023 paper review