Author: |
Meng Zhu, Weidong Min, Qing Han, Guowei Zhan, Qiyan Fu, Jiahao Li |
Year of publication: |
2022 |
DOI: |
10.21203/rs.3.rs-2205470/v1 |
Description: |
Neural networks have achieved success in both computer vision and natural language processing. Existing state-of-the-art neural networks tend to increase model complexity to improve performance. However, this causes two problems. First, these networks cannot be deployed on computationally limited platforms. Second, they cannot meet real-time requirements for high-resolution input. To alleviate these two problems, we propose a novel skip connection method called the channel splitting-and-merging (CSM) connection, which is designed to directly replace the residual connection in neural networks while reducing parameters and running latency. By dissecting the channel shuffle connection method, we find that the channel splitting and channel merging operations efficiently reduce model complexity. We therefore introduce these two operations into the residual mapping to reduce model complexity. Experiments on a range of tasks show that replacing the residual connection with our CSM connection effectively reduces parameters and FLOPs at a small cost in performance. The experimental results also show that our CSM connection outperforms the channel shuffle connection method while requiring less inference time with an equal number of parameters. |
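Note: the abstract only describes the idea at a high level (split the channels, transform part of them, and merge instead of using a full-width residual addition). The sketch below is a minimal PyTorch illustration of that general splitting-and-merging pattern, not the authors' exact CSM block; the class name, layer choices, and split ratio are assumptions for illustration only.

```python
# Minimal sketch of a channel splitting-and-merging style block in PyTorch.
# The exact structure of the paper's CSM connection is not given in the abstract;
# the class name, layer choices, and split ratio below are illustrative assumptions.
import torch
import torch.nn as nn


class CSMBlockSketch(nn.Module):
    """Splits the input channels, transforms one part, and merges by concatenation.

    Compared with a residual block that transforms and adds back the full-width
    tensor, only a fraction of the channels pass through the convolutions here,
    which is the mechanism the abstract credits for fewer parameters and FLOPs.
    """

    def __init__(self, channels: int, split_ratio: float = 0.5):
        super().__init__()
        self.split = int(channels * split_ratio)   # channels routed through convs
        self.keep = channels - self.split          # channels passed through unchanged
        self.body = nn.Sequential(
            nn.Conv2d(self.split, self.split, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(self.split),
            nn.ReLU(inplace=True),
            nn.Conv2d(self.split, self.split, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(self.split),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel splitting: one branch is transformed, the other is kept as-is.
        transformed, kept = torch.split(x, [self.split, self.keep], dim=1)
        transformed = self.body(transformed)
        # Channel merging: concatenate the two branches back to full width.
        return self.act(torch.cat([transformed, kept], dim=1))


if __name__ == "__main__":
    block = CSMBlockSketch(channels=64)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```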
Database: |
OpenAIRE |
External link: |
|