Abstract

A foundation model is a machine learning model trained on a large and diverse dataset, typically with self-supervised pre-training, that can be adapted to a wide range of downstream tasks. Current research on time series pre-training, however, has predominantly focused on models trained exclusively on data from a single domain; the domain-specific knowledge these models acquire may not transfer to time series from other domains. In this paper, we aim to develop an effective time series foundation model by leveraging unlabeled samples from multiple domains. To achieve this, we repurposed the publicly available UCR Archive and evaluated four existing self-supervised pre-training methods, along with a novel method, on its datasets. We tested these methods with four popular neural network architectures for time series to understand how the pre-training objectives interact with different network designs. Our experimental results show that pre-training improves downstream classification by accelerating the convergence of fine-tuning. Furthermore, the proposed pre-training method, combined with the Transformer, outperforms the alternatives, matching or exceeding the second-best method on ~93% of downstream tasks.
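
For readers unfamiliar with the pre-train-then-fine-tune workflow the abstract describes, the following is a minimal PyTorch-style sketch. The masked-reconstruction objective, the TinyEncoder module, and all hyperparameters here are illustrative assumptions for a generic self-supervised setup, not the paper's proposed method or its four evaluated baselines.

```python
# Minimal sketch of self-supervised pre-training followed by fine-tuning.
# The masked-reconstruction objective and TinyEncoder are illustrative
# stand-ins, not the paper's proposed method or its evaluated architectures.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    """Maps a univariate series (batch, length) to per-step features."""
    def __init__(self, d_model: int = 64):
        super().__init__()
        self.proj = nn.Conv1d(1, d_model, kernel_size=3, padding=1)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.proj(x.unsqueeze(1))            # (batch, d_model, length)
        return self.encoder(h.transpose(1, 2))   # (batch, length, d_model)

def pretrain_loss(encoder, recon_head, x, mask_ratio=0.15):
    """Self-supervised objective: mask random time steps, reconstruct them."""
    mask = torch.rand_like(x) < mask_ratio
    recon = recon_head(encoder(x.masked_fill(mask, 0.0))).squeeze(-1)
    return ((recon - x) ** 2)[mask].mean()       # MSE on masked steps only

def finetune_loss(encoder, clf_head, x, y):
    """Downstream objective: mean-pool per-step features, then classify."""
    return F.cross_entropy(clf_head(encoder(x).mean(dim=1)), y)

if __name__ == "__main__":
    enc = TinyEncoder()
    recon_head = nn.Linear(64, 1)   # pre-training head, discarded after phase 1
    clf_head = nn.Linear(64, 5)     # e.g. a hypothetical 5-class UCR dataset

    x_unlabeled = torch.randn(8, 128)            # toy unlabeled batch
    opt = torch.optim.Adam([*enc.parameters(), *recon_head.parameters()])
    for _ in range(3):                           # phase 1: pre-train
        opt.zero_grad()
        pretrain_loss(enc, recon_head, x_unlabeled).backward()
        opt.step()

    x, y = torch.randn(8, 128), torch.randint(0, 5, (8,))  # toy labeled batch
    opt = torch.optim.Adam([*enc.parameters(), *clf_head.parameters()], lr=1e-4)
    for _ in range(3):                           # phase 2: fine-tune
        opt.zero_grad()
        finetune_loss(enc, clf_head, x, y).backward()
        opt.step()
```

The key design point is that the same encoder weights carry over from the unlabeled pre-training phase into supervised fine-tuning; this weight reuse is the mechanism the abstract credits with faster fine-tuning convergence.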

Keywords:
Computer science, Machine learning, Artificial intelligence, Time series, Transformer, Artificial neural network, Training set, Data mining

Metrics

Cited by: 14
FWCI (Field-Weighted Citation Impact): 3.76
References: 18
Citation Normalized Percentile: 0.93 (in the top 10%)

Topics

Time Series Analysis and Forecasting (Physical Sciences → Computer Science → Signal Processing)
Anomaly Detection Techniques and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Data Stream Mining Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

ChatTime: A Unified Multimodal Time Series Foundation Model Bridging Numerical and Textual Data

Chengsen Wang, Qi Qi, Jingyu Wang, Haifeng Sun, Zirui Zhuang, Jinming Wu, Lei Zhang, Jianxin Liao

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2025, Vol: 39(12), Pages: 12694-12702

JOURNAL ARTICLE

Crude oil risk forecasting using time series foundation model

Kaijian He, Lean Yu, Yingchao Zou

Journal: Procedia Computer Science, Year: 2025, Vol: 266, Pages: 578-586