Showing 1 - 10 of 516 results for the search: '"Liu Chenghao"'
Time series forecasting aids decision-making, especially for stakeholders who rely on accurate predictions, making it very important to understand and explain these models to ensure informed decisions. Traditional explainable AI (XAI) methods, which …
External link:
http://arxiv.org/abs/2410.14180
Author:
Liu, Xu; Liu, Juncheng; Woo, Gerald; Aksu, Taha; Liang, Yuxuan; Zimmermann, Roger; Liu, Chenghao; Savarese, Silvio; Xiong, Caiming; Sahoo, Doyen
Time series foundation models have demonstrated impressive performance as zero-shot forecasters. However, achieving effectively unified training on time series remains an open challenge. Existing approaches introduce some level of model specialization …
External link:
http://arxiv.org/abs/2410.10469
Author:
Aksu, Taha; Woo, Gerald; Liu, Juncheng; Liu, Xu; Liu, Chenghao; Savarese, Silvio; Xiong, Caiming; Sahoo, Doyen
Time series foundation models excel in zero-shot forecasting, handling diverse tasks without explicit training. However, the advancement of these models has been hindered by the lack of comprehensive benchmarks. To address this gap, we introduce the …
External link:
http://arxiv.org/abs/2410.10393
Author:
Rector-Brooks, Jarrid; Hasan, Mohsin; Peng, Zhangzhi; Quinn, Zachary; Liu, Chenghao; Mittal, Sarthak; Dziri, Nouha; Bronstein, Michael; Bengio, Yoshua; Chatterjee, Pranam; Tong, Alexander; Bose, Avishek Joey
Generative modeling of discrete data underlies important applications spanning from text-based agents like ChatGPT to the design of the very building blocks of life in protein sequences. However, application domains need to exert control over the generated …
External link:
http://arxiv.org/abs/2410.08134
Foundation models have emerged as a promising approach in time series forecasting (TSF). Existing approaches either repurpose large language models (LLMs) or build large-scale time series datasets to develop TSF foundation models for universal forecasting …
External link:
http://arxiv.org/abs/2408.17253
Author:
Liu, Juncheng; Liu, Chenghao; Woo, Gerald; Wang, Yiwei; Hooi, Bryan; Xiong, Caiming; Sahoo, Doyen
Transformer-based models have emerged as powerful tools for multivariate time series forecasting (MTSF). However, existing Transformer models often fall short of capturing both intricate dependencies across variate and temporal dimensions in MTS data …
External link:
http://arxiv.org/abs/2406.04975
Unlike natural language processing and computer vision, the development of Foundation Models (FMs) for time series forecasting is blocked due to data scarcity. While recent efforts are focused on building such FMs by unlocking the potential of language …
External link:
http://arxiv.org/abs/2405.14252
Author:
Yang, Yiyuan; Jin, Ming; Wen, Haomin; Zhang, Chaoli; Liang, Yuxuan; Ma, Lintao; Wang, Yi; Liu, Chenghao; Yang, Bin; Xu, Zenglin; Bian, Jiang; Pan, Shirui; Wen, Qingsong
The study of time series is crucial for understanding trends and anomalies over time, enabling predictive insights across various sectors. Spatio-temporal data, on the other hand, is vital for analyzing phenomena in both space and time, providing a …
External link:
http://arxiv.org/abs/2404.18886
PEMT: Multi-Task Correlation Guided Mixture-of-Experts Enables Parameter-Efficient Transfer Learning
Parameter-efficient fine-tuning (PEFT) has emerged as an effective method for adapting pre-trained language models to various tasks efficiently. Recently, there has been a growing interest in transferring knowledge from one or multiple tasks to the …
External link:
http://arxiv.org/abs/2402.15082
Time series analysis and modelling constitute a crucial research area. Traditional artificial neural networks struggle with complex, non-stationary time series data due to high computational complexity, limited ability to capture temporal information …
External link:
http://arxiv.org/abs/2402.05423