MTP: Advancing Remote Sensing Foundation Model via Multitask Pretraining

Authors: Di Wang, Jing Zhang, Minqiang Xu, Lin Liu, Dongsheng Wang, Erzhong Gao, Chengxi Han, Haonan Guo, Bo Du, Dacheng Tao, Liangpei Zhang
Language: English
Year of publication: 2024
Source: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol 17, Pp 11632-11654 (2024)
Document type: article
ISSN: 1939-1404 (print); 2151-1535 (electronic)
DOI: 10.1109/JSTARS.2024.3408154
Description: Foundation models have reshaped the landscape of remote sensing (RS) by enhancing various image interpretation tasks. Pretraining is an active research topic, encompassing supervised and self-supervised learning methods to initialize model weights effectively. However, transferring pretrained models to downstream tasks may encounter a task discrepancy, since pretraining is typically formulated as an image classification or object discrimination task. In this study, we explore the multitask pretraining (MTP) paradigm for RS foundation models to address this issue. Using a shared encoder and task-specific decoder architecture, we conduct multitask supervised pretraining on the Segment Anything Model annotated remote sensing segmentation (SAMRS) dataset, covering semantic segmentation, instance segmentation, and rotated object detection. MTP supports both convolutional neural network and vision transformer foundation models with over 300 million parameters. The pretrained models are finetuned on various RS downstream tasks, such as scene classification, horizontal and rotated object detection, semantic segmentation, and change detection. Extensive experiments across 14 datasets demonstrate the superiority of our models over existing ones of similar size, and their competitive performance compared with larger state-of-the-art models, thus validating the effectiveness of MTP.
Databáze: Directory of Open Access Journals
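The description above centers on one architectural idea: a single shared encoder feeding several task-specific decoders, trained jointly under supervision. The following is a minimal illustrative PyTorch sketch of that pattern; every module name, shape, and loss here is an assumption made for clarity, not the authors' actual MTP implementation, which uses large CNN/ViT backbones (over 300 million parameters) with full segmentation and detection heads.

# Illustrative sketch of shared-encoder / task-specific-decoder multitask
# pretraining. All modules, shapes, and losses are placeholder assumptions.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Stand-in backbone; MTP would use a large CNN or vision transformer."""
    def __init__(self, in_ch=3, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class MultiTaskModel(nn.Module):
    """One encoder shared across tasks, one lightweight head per task."""
    def __init__(self, dim=64, num_classes=16):
        super().__init__()
        self.encoder = SharedEncoder(dim=dim)
        # Task-specific heads; real MTP uses full segmentation/detection decoders.
        self.heads = nn.ModuleDict({
            "semantic_seg": nn.Conv2d(dim, num_classes, 1),
            "instance_seg": nn.Conv2d(dim, num_classes, 1),
            "rotated_det": nn.Conv2d(dim, 5, 1),  # e.g., (cx, cy, w, h, angle) per location
        })
    def forward(self, x):
        feat = self.encoder(x)
        return {task: head(feat) for task, head in self.heads.items()}

# One pretraining step: sum the per-task losses so every task's gradient
# updates the shared encoder weights.
model = MultiTaskModel()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
images = torch.randn(2, 3, 128, 128)  # dummy batch standing in for SAMRS images
outputs = model(images)
# Dummy regression targets; real targets come from the SAM-annotated dataset,
# with task-appropriate losses (e.g., cross-entropy, detection losses).
targets = {t: torch.zeros_like(o) for t, o in outputs.items()}
loss = sum(nn.functional.mse_loss(outputs[t], targets[t]) for t in outputs)
opt.zero_grad()
loss.backward()
opt.step()

The design choice mirrored here is that all task losses backpropagate into the same encoder, so its representation must serve segmentation and detection simultaneously; when transferring to a downstream task, the task-specific decoders can be discarded and only the pretrained encoder is kept.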