Exploring the power of pure attention mechanisms in blind room parameter estimation

Authors: Chunxi Wang, Maoshen Jia, Meiran Li, Changchun Bao, Wenyu Jin
Language: English
Year of publication: 2024
Subject:
Source: EURASIP Journal on Audio, Speech, and Music Processing, Vol 2024, Iss 1, Pp 1-18 (2024)
Document type: article
ISSN: 1687-4722
DOI: 10.1186/s13636-024-00344-8
Description: Abstract Dynamic parameterization of acoustic environments has drawn widespread attention in the field of audio processing. Precise representation of local room acoustic characteristics is crucial when designing audio filters for various audio rendering applications. Key parameters in this context include the reverberation time (RT$_{60}$) and the geometric room volume. In recent years, neural networks have been extensively applied to the task of blind room parameter estimation. However, it remains an open question whether pure attention mechanisms can achieve superior performance in this task. To address this question, this study investigates blind room parameter estimation from monaural noisy speech signals. Various model architectures are examined, including a proposed attention-based model. This model is a convolution-free Audio Spectrogram Transformer that combines patch splitting, attention mechanisms, and cross-modality transfer learning from a pretrained Vision Transformer. Experimental results suggest that the proposed model, relying purely on attention mechanisms without convolution, achieves significantly improved performance across various room parameter estimation tasks, especially when aided by dedicated pretraining and data augmentation schemes. Additionally, the model demonstrates greater adaptability and robustness than existing methods when handling variable-length audio inputs.
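To make the described architecture concrete, the following is a minimal sketch of a convolution-free, patch-based spectrogram transformer regressing a room parameter such as RT$_{60}$ from a log-mel spectrogram. It only illustrates the general AST-style pipeline named in the abstract (patch splitting, linear embedding, attention encoder, regression head); all names, dimensions, and hyperparameters (e.g., PatchSpectrogramRegressor, patch size 16, 6 encoder layers) are illustrative assumptions, not the authors' implementation, and the cross-modality ViT pretraining is omitted.

```python
# Minimal sketch of a convolution-free spectrogram-transformer regressor for
# blind room parameter estimation (e.g., RT60). Illustrative only; names and
# hyperparameters are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


class PatchSpectrogramRegressor(nn.Module):
    def __init__(self, n_mels=128, n_frames=256, patch=16,
                 d_model=192, n_heads=3, n_layers=6, n_outputs=1):
        super().__init__()
        assert n_mels % patch == 0 and n_frames % patch == 0
        n_patches = (n_mels // patch) * (n_frames // patch)
        # Patch splitting: cut the (mel x time) spectrogram into non-overlapping
        # patch x patch tiles and flatten each tile into a vector.
        self.unfold = nn.Unfold(kernel_size=patch, stride=patch)
        self.proj = nn.Linear(patch * patch, d_model)         # linear patch embedding
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))   # summary token
        self.pos = nn.Parameter(torch.zeros(1, n_patches + 1, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model,
            batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_outputs)             # RT60 / log-volume regression

    def forward(self, spec):
        # spec: (batch, n_mels, n_frames) log-mel spectrogram of noisy speech
        x = self.unfold(spec.unsqueeze(1)).transpose(1, 2)    # (batch, n_patches, patch*patch)
        x = self.proj(x)
        cls = self.cls.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1) + self.pos
        x = self.encoder(x)                                   # pure attention, no convolutions
        return self.head(x[:, 0])                             # predict from the summary token


model = PatchSpectrogramRegressor()
dummy = torch.randn(2, 128, 256)                              # two example spectrograms
print(model(dummy).shape)                                     # torch.Size([2, 1])
```

In practice, the positional embedding and encoder weights could be initialized from a pretrained Vision Transformer (the cross-modality transfer mentioned in the abstract), and variable-length inputs would require interpolating or masking the positional embedding; this sketch fixes the input size for brevity.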
Database: Directory of Open Access Journals