Paper Title


PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting

Paper Authors

Hao Xue and Flora D. Salim

Paper Abstract


This paper presents a new perspective on time series forecasting. In existing time series forecasting methods, the models take a sequence of numerical values as input and yield numerical values as output. The existing SOTA models are largely based on the Transformer architecture, modified with multiple encoding mechanisms to incorporate the context and semantics around the historical data. Inspired by the success of pre-trained language foundation models, we pose a question about whether these models can also be adapted to solve time series forecasting. Thus, we propose a new forecasting paradigm: prompt-based time series forecasting (PromptCast). In this novel task, the numerical input and output are transformed into prompts, and the forecasting task is framed in a sentence-to-sentence manner, making it possible to directly apply language models for forecasting purposes. To support and facilitate research on this task, we also present a large-scale dataset (PISA) that covers three real-world forecasting scenarios. We evaluate different SOTA numerical-based forecasting methods and language generation models. The benchmark results across various forecasting settings demonstrate that the proposed PromptCast with language generation models is a promising research direction. Additionally, in comparison to conventional numerical-based forecasting, PromptCast shows much better generalization ability under the zero-shot setting.
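The core idea of the abstract, turning a numerical history into a natural-language prompt and reading the forecast back out of a generated sentence, can be sketched as below. This is a minimal illustration under assumed wording: the function names and prompt template are hypothetical, since the actual templates used in the PromptCast/PISA dataset are not given in this abstract.

```python
import re

def make_prompt(values, unit="visitors"):
    # Hypothetical template: serialize the numeric history into a sentence
    # and pose the one-step-ahead forecast as a question for a language model.
    history = ", ".join(str(v) for v in values)
    n = len(values)
    return (f"From day 1 to day {n}, there were {history} {unit} on each day. "
            f"How many {unit} will there be on day {n + 1}?")

def parse_forecast(sentence):
    # Map the model's generated sentence back to a numerical prediction
    # by extracting the first integer it contains.
    match = re.search(r"-?\d+", sentence)
    return int(match.group()) if match else None
```

A prompt built this way can be fed to any sequence-to-sequence language model, and `parse_forecast` converts the generated answer (e.g. "There will be 120 visitors.") back into a number for standard forecasting metrics.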
