Contrastive Modules with Temporal Attention for Multi-Task Reinforcement Learning

Author: Lan, Siming, Zhang, Rui, Yi, Qi, Guo, Jiaming, Peng, Shaohui, Gao, Yunkai, Wu, Fan, Chen, Ruizhi, Du, Zidong, Hu, Xing, Zhang, Xishan, Li, Ling, Chen, Yunji
Publication year: 2023
Subject:
Document type: Working Paper
Description: In the field of multi-task reinforcement learning, the modular principle, which involves specializing functionalities into different modules and combining them appropriately, has been widely adopted as a promising approach to prevent the negative transfer problem, i.e., performance degradation due to conflicts between tasks. However, most existing multi-task RL methods only combine shared modules at the task level, ignoring that there may be conflicts within a task. In addition, these methods do not take into account that, without constraints, some modules may learn similar functions, restricting the expressiveness and generalization capability of modular methods. In this paper, we propose the Contrastive Modules with Temporal Attention (CMTA) method to address these limitations. CMTA constrains the modules to be different from each other via contrastive learning and combines shared modules at a finer granularity than the task level with temporal attention, alleviating negative transfer within tasks and improving both the generalization ability and the performance of multi-task RL. We conducted experiments on Meta-World, a multi-task RL benchmark containing various robotic manipulation tasks. Experimental results show that CMTA outperforms learning each task individually for the first time and achieves substantial performance improvements over the baselines.
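
Illustration (not part of the record): a minimal PyTorch sketch of the two ideas named in the description, under assumed shapes and losses. The class CMTASketch, the linear attention head, and the pairwise-dissimilarity penalty are hypothetical stand-ins, not the paper's actual architecture or contrastive objective; they only show how shared modules can be mixed per time step by attention and pushed to learn distinct functions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CMTASketch(nn.Module):
    # Hypothetical sketch: K shared modules mixed at each time step by attention
    # weights computed from the current observation, plus a pairwise
    # dissimilarity penalty standing in for the paper's contrastive objective.
    def __init__(self, obs_dim, hidden_dim=64, num_modules=4):
        super().__init__()
        self.shared_modules = nn.ModuleList([
            nn.Sequential(nn.Linear(obs_dim, hidden_dim), nn.ReLU(),
                          nn.Linear(hidden_dim, hidden_dim))
            for _ in range(num_modules)
        ])
        # Attention over modules conditioned on the observation at each step
        # (the paper may also condition on task embeddings or recurrent state).
        self.attn = nn.Linear(obs_dim, num_modules)

    def forward(self, obs):
        # Each module encodes the observation independently: (B, K, H)
        outs = torch.stack([m(obs) for m in self.shared_modules], dim=1)
        # Per-step ("temporal") mixing weights over modules: (B, K)
        weights = F.softmax(self.attn(obs), dim=-1)
        combined = (weights.unsqueeze(-1) * outs).sum(dim=1)  # (B, H)
        return combined, outs

    def module_dissimilarity_loss(self, outs, temperature=0.1):
        # Penalize cosine similarity between different modules' outputs so
        # that modules are pushed toward learning distinct functions.
        z = F.normalize(outs, dim=-1)                           # (B, K, H)
        sim = torch.einsum('bkh,blh->bkl', z, z) / temperature  # (B, K, K)
        sim = sim - torch.diag_embed(torch.diagonal(sim, dim1=1, dim2=2))
        return sim.mean()

# Illustrative usage: the combined representation would feed the policy/value
# heads, and the penalty is added to the RL loss with a small coefficient.
model = CMTASketch(obs_dim=39)   # 39-dim observation is an assumption
obs = torch.randn(8, 39)
rep, module_outs = model(obs)
aux_loss = 0.1 * model.module_dissimilarity_loss(module_outs)
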
Comment: This paper has been accepted at NeurIPS 2023 as a poster
Database: arXiv