Visual speech synthesis for speech perception experiments
| Author | N. M. Brooke |
| --- | --- |
| Year of publication | 1982 |
| Subject | Motor theory of speech perception; Speech perception; Acoustics and Ultrasonics; Videotape Recording; Computer science; Acoustics; Speech recognition; Speech synthesis; Speech processing; Arts and Humanities (miscellaneous); Transcription (linguistics); Neurocomputational speech processing |
| Source | The Journal of the Acoustical Society of America, 71:S77–S77 |
| ISSN | 0001-4966 |
| Description | Analytical investigations of speech perception in the audio‐visual domain require a visual stimulus that is plausibly lifelike, controllable, and well‐specified. A computer package has been developed to produce real‐time animated graphics which simulate the front‐facial topography and articulatory movements of the lips and jaw during VCV speech utterances. It is highly modular and can simulate a wide range of facial features, shapes, and movements. It is currently driven by streams of time‐varying positional data obtained from experimental measurements of human speakers enunciating VCV utterances. The measurements of a series of point coordinates are made from sequential single frames of a videotape recording using a microprocessor‐linked data‐logging device. Corrections are made for the effects of global head and body movements. This is the lowest level of control in a hierarchy whose higher levels could include algorithms for generating the articulatory trajectories by rule from phonetic transcriptions. ... |
| Database | OpenAIRE |
| External link | |
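
The description notes that the digitised point coordinates are corrected for the effects of global head and body movements before they drive the animation, but the abstract gives no algorithmic detail. The following is a minimal sketch of one standard way such a correction can be done: a least-squares rigid (Kabsch/Procrustes) alignment of head-fixed markers to a reference frame, applied to every digitised frame. The marker choices, array shapes, function names, and the use of NumPy are illustrative assumptions, not the method of the package described by Brooke.

```python
# Hypothetical sketch: removing global head/body movement from per-frame
# 2-D point coordinates so that only articulatory motion (lips, jaw) remains.
import numpy as np


def rigid_align(ref_pts: np.ndarray, frame_pts: np.ndarray):
    """Least-squares 2-D rotation R and translation t mapping frame_pts onto ref_pts.

    Both arrays have shape (n_markers, 2); the markers are assumed to move
    only with the head (e.g. nose bridge, ear markers).
    """
    ref_c = ref_pts.mean(axis=0)
    frm_c = frame_pts.mean(axis=0)
    # Cross-covariance of the centred marker sets (Kabsch algorithm).
    H = (frame_pts - frm_c).T @ (ref_pts - ref_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = ref_c - R @ frm_c
    return R, t


def stabilise(frames: np.ndarray, head_idx, ref_frame: int = 0) -> np.ndarray:
    """Map every frame into the coordinate system of frames[ref_frame].

    frames: (n_frames, n_points, 2) array of digitised point coordinates.
    head_idx: indices of the head-fixed markers used to estimate head motion.
    """
    ref = frames[ref_frame, head_idx]
    out = np.empty_like(frames)
    for i, pts in enumerate(frames):
        R, t = rigid_align(ref, pts[head_idx])
        out[i] = pts @ R.T + t  # apply the correcting transform to all points
    return out


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.uniform(0, 100, size=(8, 2))            # 8 digitised points
    frames = np.stack([base.copy() for _ in range(50)])
    # Simulate jaw opening on point 7 plus a slow global drift and rotation.
    for i in range(50):
        theta = 0.002 * i
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        frames[i, 7, 1] -= 0.2 * i                      # articulatory movement
        frames[i] = frames[i] @ R.T + [0.1 * i, 0.05 * i]
    stab = stabilise(frames, head_idx=[0, 1, 2, 3])
    print("residual head-marker drift:",
          np.abs(stab[:, :4] - stab[0, :4]).max())      # ~0 after correction
    print("retained jaw movement:",
          stab[0, 7, 1] - stab[-1, 7, 1])               # articulation preserved
```

The design choice here is to estimate the global motion only from markers assumed to be rigid with the head, so that the correction does not absorb the lip and jaw movements the animation is meant to reproduce; any similar rigid or similarity alignment would serve the same purpose.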