Author: Qiang Li, Jiwei Qin, Daishun Cui, Dezhi Sun, Dacheng Wang
Language: English
Year of publication: 2024
Subject:
Source: Journal of Big Data, Vol 11, Iss 1, Pp 1-21 (2024)
Document type: article
ISSN: 2196-1115
DOI: 10.1186/s40537-024-01001-9
Description: Transformer-based methods have achieved excellent results in time series forecasting thanks to their strong ability to model sequences and capture long-term dependencies. However, they remain computationally inefficient and costly when modeling long sequences. In addition, previous work on channel mixing has overlooked how an excess of mixed features can corrupt the original in-channel features. To address these issues, we introduce Mamba into time series forecasting. The selective state-space model Mamba models long sequences more efficiently because its cost scales linearly with sequence length. Building on this, we propose Channel Mixing Mamba (CMMamba), which uses bidirectional Mamba to model sequences and a channel mixing mechanism that selects appropriate mixed features to enhance the feature representation within each original channel. We also propose a Deep Convolutional Structure (DCS) to learn cross-channel dependencies and temporal-order information. Extensive experimental results on six real-world public datasets demonstrate the effectiveness of CMMamba.
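The abstract describes two ingredients: bidirectional sequence modeling and a channel mixing step that selects mixed features rather than letting them overwrite the original per-channel features. The sketch below illustrates those two ideas only; it is not the authors' implementation. A plain GRU stands in for the Mamba layer so the example runs with PyTorch alone, and all class and parameter names are illustrative assumptions.

```python
# Hypothetical sketch: bidirectional sequence block + gated channel mixing.
# CMMamba itself uses selective state-space (Mamba) layers; a GRU is used here
# only so the sketch is self-contained and runnable.
import torch
import torch.nn as nn


class BidirectionalBlock(nn.Module):
    """Run a sequence model forward and backward over time, then fuse both passes."""

    def __init__(self, d_model: int):
        super().__init__()
        # Placeholder for a Mamba layer; the GRU keeps the sketch dependency-free.
        self.fwd = nn.GRU(d_model, d_model, batch_first=True)
        self.bwd = nn.GRU(d_model, d_model, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x):                        # x: (batch, length, d_model)
        h_fwd, _ = self.fwd(x)
        h_bwd, _ = self.bwd(torch.flip(x, dims=[1]))
        h_bwd = torch.flip(h_bwd, dims=[1])      # re-align the backward pass in time
        return self.fuse(torch.cat([h_fwd, h_bwd], dim=-1))


class GatedChannelMixing(nn.Module):
    """Mix features across channels, gating the mixed features so they augment
    rather than replace the original in-channel representation."""

    def __init__(self, n_channels: int):
        super().__init__()
        self.mix = nn.Linear(n_channels, n_channels)
        self.gate = nn.Linear(n_channels, n_channels)

    def forward(self, x):                        # x: (batch, length, n_channels)
        mixed = self.mix(x)                      # cross-channel features
        g = torch.sigmoid(self.gate(x))          # select how much mixing to admit
        return x + g * mixed                     # original channel kept, mix added


if __name__ == "__main__":
    x = torch.randn(8, 96, 16)                   # 8 series, 96 time steps, 16 channels
    y = BidirectionalBlock(16)(GatedChannelMixing(16)(x))
    print(y.shape)                               # torch.Size([8, 96, 16])
```

The additive, gated form of the mixing step reflects the abstract's concern that unrestricted channel mixing can destroy in-channel features; how CMMamba actually selects mixed features is specified in the paper, not here.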
Database: Directory of Open Access Journals
External link: