Research on the Correlation Between the Timbre Attributes of Musical Sound and Visual Color
Author: | Jingyu Liu, Hui Ren, Anni Zhao, Yiyang Li, Shuang Wang |
---|---|
Language: | English |
Year of publication: | 2021 |
Subject: |
General Computer Science
Artificial neural network business.industry Experimental psychology General Engineering Information processing Experimental data Pattern recognition timbre-color correlation model Visualization Timbre color TK1-9971 Visual processing Correlation audiovisual cross-modality timbre feature extraction General Materials Science Artificial intelligence Electrical engineering. Electronics. Nuclear engineering Electrical and Electronic Engineering business |
Source: | IEEE Access, Vol 9, pp. 97855-97877 (2021) |
ISSN: | 2169-3536 |
Description: | This article investigates the relationship between the timbre attributes of musical sound and visual color, combining methods and techniques from experimental psychology and audio information processing. First, we designed and conducted a subjective perception experiment on audiovisual cross-modal timbre-color correlation; statistical analysis of the experimental data yielded a complete timbre-color cross-modal correlation dataset. Second, visualization and correlation analysis of these data showed that the timbre dimensions and the color dimensions are measurably correlated. On this basis, we constructed timbre-color correlation models with three algorithms, namely, multiple linear regression, a BP neural network, and SVR, and verified the accuracy of all three models. The timbre-color correlation dataset constructed in this paper provides basic data support for audiovisual cross-modal research, the correlation model provides a theoretical basis for cross-modal audiovisual applications, and the research method offers a new approach to audiovisual cross-modality studies. |
Database: | OpenAIRE |
External link: |