
NeurIPS Paper Reviews 2023 #6

23 January 2024
  • Quantitative Research

Our team of quantitative researchers have shared the most interesting research presented during workshops and seminars at NeurIPS 2023.

Discover the perspectives of quantitative analyst Rui as she discusses her most compelling findings from the conference.


Predict, Refine, Synthesize: Self-Guiding Diffusion Models for Probabilistic Time Series Forecasting

Marcel Kollovieh, Abdul Fatir Ansari, Michael Bohlke-Schneider, Jasper Zschiegner, Hao Wang, Yuyang Wang

The authors demonstrate how an unconditionally trained generative model can be just as useful as task-specific conditional models across a variety of tasks, without needing to adapt its training process.

For this purpose, they introduce TSDiff, an unconditionally trained diffusion model for time series, and show how it can be used to:

  1. Sample from a conditional distribution despite never being trained conditionally;
  2. Refine predictions from other forecasting models by casting forecast refinement as a regularized optimization problem that uses TSDiff's learned likelihood;
  3. Provide better synthetic data for downstream forecasting than other generative models for time series.

The authors conduct experiments for each of these properties, comparing TSDiff’s performance against statistical and probabilistic models tailored to specific tasks. The results consistently demonstrate that TSDiff performs at least as well as, if not better than, the selected alternative models for each respective task.
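To make the first of these ideas concrete, below is a minimal sketch of guidance-based conditional sampling with an unconditionally trained denoiser: at each reverse diffusion step, the sample is nudged towards agreement with the observed part of the series. This is an illustration under simplifying assumptions rather than the paper's implementation; the toy noise schedule, the placeholder eps_model, the quadratic observation-consistency loss and the guidance_scale value all stand in for TSDiff's trained network and its self-guidance objective.

    import torch

    T = 100                                    # number of diffusion steps
    betas = torch.linspace(1e-4, 0.02, T)      # toy linear noise schedule
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    def eps_model(x, t):
        """Stand-in for a trained unconditional time series denoiser."""
        return torch.zeros_like(x)

    def sample_conditional(context, mask, length, guidance_scale=50.0):
        """Sample a series whose observed positions (mask == 1) should match context."""
        x = torch.randn(length)
        for t in reversed(range(T)):
            x = x.detach().requires_grad_(True)
            eps = eps_model(x, t)
            # Estimate the clean series x_0 from the noisy sample (standard DDPM identity).
            x0_hat = (x - torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alpha_bars[t])
            # Guidance: squared error on the observed positions only.
            loss = ((mask * (x0_hat - context)) ** 2).sum()
            grad = torch.autograd.grad(loss, x)[0]
            # Ancestral DDPM step, shifted by the guidance gradient.
            mean = (x - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
            noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
            x = mean - guidance_scale * grad + torch.sqrt(betas[t]) * noise
        return x.detach()

    # Condition on 24 observed points and sample the remaining 12.
    context = torch.zeros(36)
    context[:24] = torch.sin(torch.linspace(0.0, 6.28, 24))
    mask = torch.zeros(36)
    mask[:24] = 1.0
    forecast = sample_conditional(context, mask, length=36)

Dropping the guidance term recovers ordinary unconditional sampling, which is what lets a single trained model serve several tasks.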


Conformal Prediction for Time Series with Modern Hopfield Networks

Andreas Auer, Martin Gauch, Daniel Klotz, Sepp Hochreiter

Conformal prediction provides distribution- and model-agnostic ways to construct prediction intervals from quantiles of a weighted error distribution over past predictions. However, it assumes the data are exchangeable (the order of observations does not matter), an assumption violated by time series with temporal autocorrelation. This paper presents HopCPT, a framework for producing valid conformal prediction intervals in this setting.
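For reference, here is a minimal sketch (not taken from the paper) of how a split-conformal interval is built for a point forecaster when exchangeability holds: the half-width of the interval is a finite-sample-corrected quantile of past absolute errors. The residuals and forecast value below are illustrative placeholders.

    import numpy as np

    def conformal_interval(past_errors, point_forecast, alpha=0.1):
        """Symmetric (1 - alpha) interval whose half-width is a quantile of past absolute errors."""
        n = len(past_errors)
        # Finite-sample correction used in split conformal prediction.
        level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
        q = np.quantile(np.abs(past_errors), level)
        return point_forecast - q, point_forecast + q

    # Toy residuals standing in for a real forecaster's past errors.
    rng = np.random.default_rng(0)
    residuals = rng.normal(scale=0.5, size=200)
    lo, hi = conformal_interval(residuals, point_forecast=3.2, alpha=0.1)

HopCPT keeps this quantile-of-errors construction but replaces the uniform treatment of past errors with learned, regime-dependent weights.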

Given a time series forecasting model, the key idea is that temporally related samples tend to produce similar prediction errors, forming error regimes. The authors therefore train a modern Hopfield network on these prediction errors: its associative memory retrieves past errors from a regime similar to the current one and weights them, using the learned network weights, to construct the prediction interval for a new sample.
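The sketch below illustrates that retrieve-and-weight idea; it is not HopCPT itself. The softmax attention stands in for the modern Hopfield network's associative memory, the feature vectors stand in for its learned per-time-step encodings, and beta is an assumed inverse-temperature parameter. A weighted quantile of the retrieved signed errors then gives the interval bounds around a new point forecast.

    import numpy as np

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def weighted_quantile(values, weights, q):
        """Quantile of values under normalized weights."""
        order = np.argsort(values)
        v, w = values[order], weights[order]
        cdf = np.cumsum(w) / w.sum()
        return v[np.searchsorted(cdf, q)]

    def regime_weighted_interval(query_feat, past_feats, past_errors, alpha=0.1, beta=2.0):
        """Interval bounds from past errors re-weighted by similarity to the current regime."""
        scores = beta * past_feats @ query_feat   # Hopfield-style similarity scores
        w = softmax(scores)                       # association weights over stored errors
        lo = weighted_quantile(past_errors, w, alpha / 2)
        hi = weighted_quantile(past_errors, w, 1 - alpha / 2)
        return lo, hi

    # Toy stored features (e.g. encodings of recent history) and signed errors.
    rng = np.random.default_rng(1)
    past_feats = rng.normal(size=(500, 8))
    past_errors = rng.normal(scale=0.3, size=500)
    query_feat = rng.normal(size=8)
    lo, hi = regime_weighted_interval(query_feat, past_feats, past_errors)
    # The interval for a new point forecast y_hat is (y_hat + lo, y_hat + hi).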

The authors demonstrate the efficiency (tightness) of HopCPT's prediction intervals relative to other conformal prediction approaches across a range of forecasting models and time series.


Read more of our quantitative researchers' thoughts

NeurIPS Paper Reviews 2023 #1

Discover the perspectives of Danny, one of our machine learning engineers, on the following papers:

  • A U-turn on Double Descent: Rethinking Parameter Counting in Statistical Learning
  • Normalization Layers Are All That Sharpness-Aware Minimization Needs
NeurIPS Paper Reviews 2023 #2

Discover the perspectives of Paul, one of our quantitative researchers, on the following papers:

  • Sharpness-Aware Minimization Leads to Low-Rank Features
  • When Do Neural Nets Outperform Boosted Trees on Tabular Data?
NeurIPS Paper Reviews 2023 #3

Discover the perspectives of Szymon, one of our quantitative researchers, on the following papers:

  • Convolutional State Space Models for Long-Range Spatiotemporal Modeling
  • How to Scale Your EMA
NeurIPS Paper Review 2023 #4

Discover the perspectives of Dustin, our scientific director, on the following papers:

  • Abide by the law and follow the flow: conservation laws for gradient flows
  • The Tunnel Effect: Building Data Representations in Deep Neural Networks
NeurIPS Paper Review 2023 #5

Discover the perspectives of Laurynas, one of our machine learning engineers, on the following papers:

  • Monarch Mixer: A Simple Sub-Quadratic GEMM-Based Architecture
  • QLoRA: Efficient Finetuning of Quantized LLMs
