Assessing Phrase Break of ESL Speech with Pre-trained Language Models and Large Language Models

Author: Wang, Zhiyi; Mao, Shaoguang; Wu, Wenshan; Xia, Yan; Deng, Yan; Tien, Jonathan
Publication Year: 2023
Document Type: Working Paper
Description: This work introduces approaches to assessing phrase breaks in ESL learners' speech using pre-trained language models (PLMs) and large language models (LLMs). It addresses two tasks: an overall phrase-break assessment for a speech clip, and a fine-grained assessment of every possible phrase-break position. To leverage NLP models, the speech input is first force-aligned with its transcript and then pre-processed into a token sequence containing both words and phrase-break information. For the PLMs, we propose a pre-training and fine-tuning pipeline over the processed tokens: pre-training with a replaced-break-token detection module, followed by fine-tuning for text classification and sequence labeling. For the LLMs, we design prompts for ChatGPT. Experiments show that the PLMs greatly reduce the dependence on labeled training data while improving performance. Meanwhile, we verify that ChatGPT, a renowned LLM, has potential for further advancement in this area.
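The pre-processing step described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the `<brk>`/`<no-brk>` token names and the 0.1 s pause threshold are assumptions chosen for the example, and the alignment tuples stand in for the output of a forced aligner.

```python
# Sketch: convert force-aligned words plus inter-word pause durations into
# a token sequence interleaving words with phrase-break markers.
# Assumed (not from the paper): token names <brk>/<no-brk>, 0.1 s threshold.

PAUSE_THRESHOLD_S = 0.1  # assumed minimum silence for marking a phrase break

def to_token_sequence(aligned_words):
    """aligned_words: list of (word, start_s, end_s) from forced alignment."""
    tokens = []
    for i, (word, start, end) in enumerate(aligned_words):
        tokens.append(word)
        if i + 1 < len(aligned_words):
            # pause = gap between this word's end and the next word's start
            pause = aligned_words[i + 1][1] - end
            tokens.append("<brk>" if pause >= PAUSE_THRESHOLD_S else "<no-brk>")
    return tokens

alignment = [("I", 0.00, 0.20), ("like", 0.25, 0.55), ("tea", 0.90, 1.20)]
print(to_token_sequence(alignment))
# → ['I', '<no-brk>', 'like', '<brk>', 'tea']
```

A sequence in this form can then be fed to a PLM for sequence labeling (judging each break position) or text classification (scoring the whole clip), as the abstract outlines.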
Comment: Accepted by InterSpeech 2023. arXiv admin note: substantial text overlap with arXiv:2210.16029
Database: arXiv