Kronos: A Foundation Model for the Language of Financial Markets

Yu Shi, Zongliang Fu, Shuo Chen, Bohan Zhao, Wei Xu, Changshui Zhang, Jian Li
There has been a growing wave of work on foundation models for time series, with NeurIPS 2025 even hosting a workshop on the topic. The hope is that, as with language models, one can train a large autoregressive model on a huge corpus of time-series data to obtain a foundation model that can be used zero-shot or lightly fine-tuned for a wide variety of tasks across many domains, outperforming models trained purely on specific datasets or targets.
As a step towards this, Kronos is a foundation model for financial K-line time series (open, close, high, low, volume, etc.) trained across many asset classes, exchanges, products, frequencies and years.
The approach uses an autoencoder to encode each K-line element into a binary code (split into coarse and fine parts) and a Transformer to predict the next binary code (coarse first, then fine), which is then decoded back into a K-line element. As usual in this type of work, the model achieves state-of-the-art results on several public benchmarks, including price forecasting and data generation.
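To make the coarse/fine idea concrete, here is a minimal PyTorch sketch of that pipeline. It is not the authors' code: the feature count, bit widths, layer sizes, and class names (BinaryTokenizer, CoarseFinePredictor) are illustrative assumptions, and a straight-through sign() stands in for whatever quantizer the paper actually uses.

```python
# Hypothetical sketch of K-line tokenization into coarse/fine binary codes,
# plus a Transformer that predicts the next code coarse-first-then-fine.
import torch
import torch.nn as nn


class BinaryTokenizer(nn.Module):
    """Autoencoder that maps a K-line bar to a binary code and back."""

    def __init__(self, n_features: int = 5, n_bits: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, n_bits))
        self.decoder = nn.Sequential(nn.Linear(n_bits, 64), nn.ReLU(), nn.Linear(64, n_features))

    def encode(self, bars: torch.Tensor) -> torch.Tensor:
        logits = self.encoder(bars)
        hard = torch.sign(logits)                     # +/-1 bits
        return logits + (hard - logits).detach()      # straight-through gradient

    def decode(self, code: torch.Tensor) -> torch.Tensor:
        return self.decoder(code)


class CoarseFinePredictor(nn.Module):
    """Causal Transformer that predicts the next code: coarse bits, then fine bits."""

    def __init__(self, n_bits: int = 16, d_model: int = 64):
        super().__init__()
        self.half = n_bits // 2
        self.embed = nn.Linear(n_bits, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.coarse_head = nn.Linear(d_model, self.half)             # coarse-bit logits
        self.fine_head = nn.Linear(d_model + self.half, self.half)   # fine bits given coarse

    def forward(self, codes: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # codes: (batch, time, n_bits); the causal mask keeps prediction autoregressive
        T = codes.size(1)
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        h = self.backbone(self.embed(codes), mask=mask)
        coarse_logits = self.coarse_head(h)
        coarse_bits = torch.sign(coarse_logits)                      # condition fine on coarse
        fine_logits = self.fine_head(torch.cat([h, coarse_bits], dim=-1))
        return coarse_logits, fine_logits


if __name__ == "__main__":
    bars = torch.randn(2, 32, 5)                      # (batch, time, OHLCV features)
    tokenizer, predictor = BinaryTokenizer(), CoarseFinePredictor()
    codes = tokenizer.encode(bars)
    coarse, fine = predictor(codes)
    next_code = torch.cat([torch.sign(coarse), torch.sign(fine)], dim=-1)
    print(tokenizer.decode(next_code).shape)          # predicted next bars: (2, 32, 5)
```

In the actual model the tokenizer would be trained first with a reconstruction objective and the Transformer trained on the resulting discrete codes; this sketch only shows the shape of the coarse-then-fine factorization, not the training recipe.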
While a generic foundation model is unlikely to be competitive with the task-specific or input-specific models we train internally at G-Research, it would be unwise to ignore the techniques others are finding useful on financial data.