NeurIPS Paper Reviews 2023 #6

23 January 2024
  • Quantitative Research

Our team of quantitative researchers has shared the most interesting research presented during workshops and seminars at NeurIPS 2023.

Discover the perspectives of Rui, one of our quantitative analysts, as she discusses her most compelling findings from the conference.


Predict, Refine, Synthesize: Self-Guiding Diffusion Models for Probabilistic Time Series Forecasting

Marcel Kollovieh, Abdul Fatir Ansari, Michael Bohlke-Schneider, Jasper Zschiegner, Hao Wang, Yuyang Wang

The authors demonstrate that an unconditionally trained generative model can be just as useful as task-specific conditional models across a variety of tasks, without any changes to its training process.

To this end, they introduce TSDiff, an unconditionally trained diffusion model for time series, and show how it can be used to:

  1. Sample from a conditional distribution despite never having been trained on one (see the first sketch below);
  2. Refine predictions from other forecasting models by formulating forecast refinement as a regularized optimization problem that uses TSDiff's learned likelihood (see the second sketch below);
  3. Generate synthetic data that serves downstream forecasting better than other time-series generative models.
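
To make the first of these concrete, below is a minimal PyTorch sketch of observation-guided sampling from an unconditional diffusion model, in the spirit of the paper's self-guidance. It is only an illustration: `denoiser`, `alpha_bars` and `guidance_scale` are hypothetical names, the Gaussian observation likelihood is an assumed choice, and the deterministic DDIM-style update simplifies whatever sampler the paper actually uses.

```python
import torch

def self_guided_sample(denoiser, obs, obs_mask, alpha_bars, guidance_scale=1.0):
    """Draw an approximate sample from p(x | observed part) using an
    unconditionally trained denoiser (illustrative sketch only).

    denoiser(x, t) -> predicted noise (standard DDPM parameterization)
    obs, obs_mask  -> observed values and a 0/1 mask over the series
    alpha_bars     -> cumulative noise schedule (1-D tensor, one value per step)
    """
    x = torch.randn_like(obs)                        # start from pure noise
    for t in reversed(range(len(alpha_bars))):
        x = x.detach().requires_grad_(True)
        a_t = alpha_bars[t]
        eps_hat = denoiser(x, t)
        # Point estimate of the clean series implied by the current state.
        x0_hat = (x - (1 - a_t).sqrt() * eps_hat) / a_t.sqrt()
        # Self-guidance: gradient of the observed-part log-likelihood.
        log_lik = -((obs - x0_hat) * obs_mask).pow(2).sum()
        grad = torch.autograd.grad(log_lik, x)[0]
        a_prev = alpha_bars[t - 1] if t > 0 else torch.ones(())
        # Deterministic DDIM-style step, nudged toward the observations.
        x = (a_prev.sqrt() * x0_hat + (1 - a_prev).sqrt() * eps_hat
             + guidance_scale * grad)
    return x.detach()
```

Repeating the loop produces many conditional samples, so forecast uncertainty comes out of the unconditional model without any conditional retraining.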

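The second use, forecast refinement, can likewise be pictured as a small optimization loop. In this hypothetical sketch, `diffusion_loss` stands for any differentiable proxy for the model's negative log-likelihood (for instance its denoising objective averaged over noise levels); the paper derives its own regularized objective, so this only shows the general shape of the idea.

```python
import torch

def refine_forecast(base_forecast, diffusion_loss, lam=1.0, steps=200, lr=1e-2):
    """Nudge a base model's forecast toward regions the diffusion model
    finds likely, while staying close to the original forecast (sketch).

    diffusion_loss(y) -> scalar proxy for -log p(y) (an assumption)
    lam               -> strength of the stay-close regularizer
    """
    y = base_forecast.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([y], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Regularized objective: model likelihood + fidelity to the base forecast.
        energy = diffusion_loss(y) + lam * (y - base_forecast).pow(2).mean()
        energy.backward()
        opt.step()
    return y.detach()
```
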
The authors conduct experiments for each of these properties, comparing TSDiff’s performance against statistical and probabilistic models tailored to specific tasks. The results consistently demonstrate that TSDiff performs at least as well as, if not better than, the selected alternative models for each respective task.


Conformal Prediction for Time Series with Modern Hopfield Networks

Andreas Auer, Martin Gauch, Daniel Klotz, Sepp Hochreiter

Conformal prediction provides distribution- and model-agnostic ways to construct prediction intervals from quantiles of a weighted error distribution over past predictions. However, it assumes the data are exchangeable (the order of observations doesn't matter), an assumption violated by time series with temporal autocorrelation. This paper presents a framework called HopCPT that provides valid prediction intervals with conformal prediction in this setting.
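
For reference, here is a minimal sketch of a plain split conformal interval under exchangeability; the absolute-error score and the function name are illustrative choices, not taken from the paper.

```python
import numpy as np

def split_conformal_interval(cal_residuals, y_hat, alpha=0.1):
    """Standard split conformal interval (assumes exchangeable data).

    cal_residuals : absolute errors |y_i - f(x_i)| on a calibration set
    y_hat         : point forecast for the new input
    alpha         : target miscoverage rate (0.1 -> 90% interval)
    """
    n = len(cal_residuals)
    # Finite-sample-corrected quantile of the calibration residuals.
    level = min(1.0, np.ceil((1 - alpha) * (n + 1)) / n)
    q = np.quantile(cal_residuals, level, method="higher")
    return y_hat - q, y_hat + q
```

With autocorrelated residuals the exchangeability assumption fails, so such intervals can systematically over- or under-cover; this is the gap HopCPT targets.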

Given a time series forecasting model, the key idea is that temporally dependent samples fall into regimes with similar prediction errors. The authors therefore train a modern Hopfield network on the model's prediction errors: its associative memory retrieves past errors from a similar regime, and learned weights combine them into a prediction interval for a new sample (see the sketch below).
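
A hypothetical sketch of that retrieval step: an attention-style softmax over learned encodings plays the role of the Hopfield update, producing weights for a weighted quantile of the past errors. The feature vectors and inverse temperature `beta` stand in for the trained network, and uniform weights would recover a standard conformal interval.

```python
import numpy as np

def hopfield_weighted_interval(past_errors, past_feats, query_feat,
                               alpha=0.1, beta=4.0):
    """Regime-aware conformal interval in the spirit of HopCPT (sketch).

    past_errors : signed prediction errors of the base forecaster
    past_feats  : feature vectors (one per past time step), shape (n, d)
    query_feat  : feature vector of the step needing an interval, shape (d,)
    """
    # Hopfield-style retrieval: softmax similarity to past regimes.
    sims = past_feats @ query_feat
    w = np.exp(beta * (sims - sims.max()))
    w /= w.sum()
    # Weighted empirical quantiles of the past errors.
    order = np.argsort(past_errors)
    cdf = np.cumsum(w[order])
    lo_idx = min(np.searchsorted(cdf, alpha / 2), len(cdf) - 1)
    hi_idx = min(np.searchsorted(cdf, 1 - alpha / 2), len(cdf) - 1)
    return past_errors[order][lo_idx], past_errors[order][hi_idx]
```

The two error quantiles are added to the base model's point forecast to form the interval for the new sample.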

The authors demonstrate the efficiency of HopCPT's prediction intervals against other conformal prediction approaches across a range of forecasting models and time series.


Read more of our quantitative researchers' thoughts

NeurIPS Paper Reviews 2023 #1

Discover the perspectives of Danny, one of our machine learning engineers, on the following papers:

  • A U-turn on Double Descent: Rethinking Parameter Counting in Statistical Learning
  • Normalization Layers Are All That Sharpness-Aware Minimization Needs
NeurIPS Paper Reviews 2023 #2

Discover the perspectives of Paul, one of our quantitative researchers, on the following papers:

  • Sharpness-Aware Minimization Leads to Low-Rank Features
  • When Do Neural Nets Outperform Boosted Trees on Tabular Data?
NeurIPS Paper Reviews 2023 #3

Discover the perspectives of Szymon, one of our quantitative researchers, on the following papers:

  • Convolutional State Space Models for Long-Range Spatiotemporal Modeling
  • How to Scale Your EMA
NeurIPS Paper Reviews 2023 #4

Discover the perspectives of Dustin, our scientific director, on the following papers:

  • Abide by the law and follow the flow: conservation laws for gradient flows
  • The Tunnel Effect: Building Data Representations in Deep Neural Networks
NeurIPS Paper Reviews 2023 #5

Discover the perspectives of Laurynas, one of our machine learning engineers, on the following papers:

  • Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture
  • QLoRA: Efficient Finetuning of Quantized LLMs
