Author:
Chuang Lin, Xinyue Niu, Jun Zhang, Xianping Fu
Language:
English
Year of publication:
2023
Subject:

Source:
Applied Sciences, Vol 13, Iss 19, p 11071 (2023)
Document type:
article
ISSN:
2076-3417
DOI:
10.3390/app131911071
Description:
Hand motion intentions can be detected by analyzing the surface electromyographic (sEMG) signals obtained from the remaining forearm muscles of trans-radial amputees. This technology sheds new light on myoelectric prosthesis control; however, only a limited amount of signal data can be collected from amputees in clinical practice, and the collected signals can further suffer from quality deterioration due to muscular atrophy, which significantly decreases the accuracy of hand motion intention recognition. To overcome these problems, this work proposed a transfer learning strategy combined with a long-exposure-CNN (LECNN) model to improve amputees' hand motion intention recognition accuracy. Transfer learning can transfer the knowledge acquired from intact-limb subjects to amputees, and LECNN can effectively capture the information in the sEMG signals. Two datasets with 20 intact-limb and 11 amputated-limb subjects from the Ninapro database were used to develop and evaluate the proposed method. The experimental results demonstrated that the proposed transfer learning strategy significantly improved recognition performance (78.1% ± 19.9%, p-value < 0.005) compared with the non-transfer case (73.4% ± 20.8%). When the source and target data matched well, the post-transfer accuracy improved by up to 8.5%. Compared with state-of-the-art methods from two previous studies, the average accuracy was improved by 11.6% (from 67.5% to 78.1%, p-value < 0.005) and 12.1% (from 67.0% to 78.1%, p-value < 0.005). This result is also among the best achieved by the compared methods.
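The abstract only names the overall strategy (pretrain on intact-limb sEMG, then adapt the model to amputee data); this record gives no architecture or training details. The Python/PyTorch sketch below illustrates one common way such a pretrain-then-fine-tune transfer could be wired up. The network layers, channel count, class count, epochs, and learning rates are illustrative assumptions and do not reproduce the authors' LECNN model.

# Minimal sketch (not from the paper): pretrain a small 1-D CNN on
# intact-limb sEMG windows, then fine-tune on amputee data.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

class SEMGConvNet(nn.Module):
    """Toy 1-D CNN over sEMG windows of shape (channels, samples)."""
    def __init__(self, n_channels=12, n_classes=17):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).squeeze(-1))

def train(model, loader, epochs, lr, device="cpu"):
    """Standard cross-entropy training loop over (window, label) batches."""
    model.to(device)
    opt = torch.optim.Adam(
        filter(lambda p: p.requires_grad, model.parameters()), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

# Transfer strategy: (1) pretrain on intact-limb subjects (source domain),
# (2) copy the weights, freeze the convolutional feature extractor, and
# (3) fine-tune the classifier head on the amputee subject (target domain).
def transfer(source_loader, target_loader):
    source_model = train(SEMGConvNet(), source_loader, epochs=30, lr=1e-3)
    target_model = SEMGConvNet()
    target_model.load_state_dict(source_model.state_dict())
    for p in target_model.features.parameters():
        p.requires_grad = False  # keep source-learned features fixed
    return train(target_model, target_loader, epochs=10, lr=1e-4)

In this kind of setup, freezing the feature extractor is only one possible choice; fine-tuning all layers with a lower learning rate is an equally common variant when more amputee data are available.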
Database:
Directory of Open Access Journals
External link: