Effectively modeling time series with simple discrete state spaces

M Zhang, KK Saab, M Poli, T Dao, K Goel… - arXiv preprint arXiv …, 2023 - arxiv.org
Time series modeling is a well-established problem, which often requires that methods (1) expressively represent complicated dependencies, (2) forecast long horizons, and (3) efficiently train over long sequences. State-space models (SSMs) are classical models for time series, and prior works combine SSMs with deep learning layers for efficient sequence modeling. However, we find fundamental limitations with these prior approaches, proving their SSM representations cannot express autoregressive time series processes. We thus introduce SpaceTime, a new state-space time series architecture that improves all three criteria. For expressivity, we propose a new SSM parameterization based on the companion matrix -- a canonical representation for discrete-time processes -- which enables SpaceTime's SSM layers to learn desirable autoregressive processes. For long horizon forecasting, we introduce a "closed-loop" variation of the companion SSM, which enables SpaceTime to predict many future time-steps by generating its own layer-wise inputs. For efficient training and inference, we introduce an algorithm that reduces the memory and compute of a forward pass with the companion matrix. With sequence length ℓ and state-space size d, we go from Õ(dℓ) naïvely to Õ(d + ℓ). In experiments, our contributions lead to state-of-the-art results on extensive and diverse benchmarks, with best or second-best AUROC on 6 / 7 ECG and speech time series classification, and best MSE on 14 / 16 Informer forecasting tasks. Furthermore, we find SpaceTime (1) fits AR(p) processes that prior deep SSMs fail on, (2) forecasts notably more accurately on longer horizons than prior state-of-the-art, and (3) speeds up training on real-world ETTh1 data by 73% and 80% relative wall-clock time over Transformers and LSTMs.
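To illustrate the efficiency claim: a companion matrix is a shift matrix plus a rank-one term, so applying it to a state vector costs O(d) rather than the O(d²) of a dense matrix-vector product. The sketch below is a minimal NumPy illustration of this structure, not code from the SpaceTime implementation; the function names are my own, and it uses one common companion-matrix convention with the coefficients in the last column.

```python
import numpy as np

def companion(a):
    """Companion matrix for coefficients a = [a_1, ..., a_p]:
    ones on the sub-diagonal (a shift), coefficients in the last column."""
    p = len(a)
    A = np.zeros((p, p))
    A[1:, :-1] = np.eye(p - 1)   # sub-diagonal shift structure
    A[:, -1] = a                 # rank-one part: last column holds coefficients
    return A

def companion_matvec(a, x):
    """Compute companion(a) @ x as shift(x) + x[-1] * a, in O(p) time."""
    a = np.asarray(a, dtype=float)
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    out[1:] = x[:-1]             # shift contribution
    out += x[-1] * a             # rank-one contribution
    return out

a = [0.5, -0.2, 0.1]
x = np.array([1.0, 2.0, 3.0])
A = companion(a)
print(np.allclose(A @ x, companion_matvec(a, x)))  # True
```

The shift-plus-rank-one decomposition is what makes repeated state updates with the companion matrix cheap relative to a generic dense state matrix.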