Retrieving Semantics from the Deep: an RAG Solution for Gesture Synthesis

Authors: Mughal, M. Hamza, Dabral, Rishabh, Scholman, Merel C. J., Demberg, Vera, Theobalt, Christian
Publication year: 2024
Subject:
Document type: Working Paper
Description: Non-verbal communication often comprises semantically rich gestures that help convey the meaning of an utterance. Producing such semantic co-speech gestures remains a major challenge for existing neural systems, which can generate rhythmic beat gestures but struggle to produce semantically meaningful ones. We therefore present RAG-Gesture, a diffusion-based gesture generation approach that leverages Retrieval Augmented Generation (RAG) to produce natural-looking and semantically rich gestures. Our neuro-explicit gesture generation approach is designed to produce semantic gestures grounded in interpretable linguistic knowledge. We achieve this by using explicit domain knowledge to retrieve exemplar motions from a database of co-speech gestures. Once retrieved, we inject these semantic exemplar gestures into our diffusion-based gesture generation pipeline using DDIM inversion and retrieval guidance at inference time, without any need for training. Further, we propose a control paradigm for guidance that allows users to modulate the amount of influence each retrieval insertion has over the generated sequence. Our comparative evaluations demonstrate the validity of our approach against recent gesture generation methods. We encourage the reader to explore the results on our project page.
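To illustrate the mechanism the abstract describes, here is a minimal sketch (not the authors' code) of how a retrieved exemplar might be injected into a DDIM sampling loop via inversion plus a guidance term. The toy denoiser, noise schedule, and the `scale` influence knob are illustrative assumptions, not the RAG-Gesture implementation.

```python
import torch

T = 50                                   # number of DDIM steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)    # toy linear noise schedule
alphas_cum = torch.cumprod(1.0 - betas, dim=0)

class ToyDenoiser(torch.nn.Module):
    """Stand-in for the diffusion gesture model: predicts noise eps(x_t, t)."""
    def __init__(self, dim=32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim + 1, 128), torch.nn.SiLU(), torch.nn.Linear(128, dim))

    def forward(self, x, t):
        t_emb = torch.full((x.shape[0], 1), float(t) / T)
        return self.net(torch.cat([x, t_emb], dim=-1))

@torch.no_grad()
def ddim_invert(model, x0, steps=T):
    """Deterministic DDIM inversion: map a retrieved exemplar x0 to a latent x_T."""
    x = x0
    for t in range(steps - 1):
        a_t, a_next = alphas_cum[t], alphas_cum[t + 1]
        eps = model(x, t)
        x0_pred = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()
        x = a_next.sqrt() * x0_pred + (1 - a_next).sqrt() * eps
    return x

@torch.no_grad()
def ddim_sample_with_retrieval_guidance(model, x_T, exemplar, scale=0.3, steps=T):
    """DDIM sampling that nudges the predicted clean gesture toward the exemplar."""
    x = x_T
    for t in reversed(range(1, steps)):
        a_t, a_prev = alphas_cum[t], alphas_cum[t - 1]
        eps = model(x, t)
        x0_pred = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()
        # retrieval guidance: blend the prediction toward the retrieved exemplar;
        # `scale` plays the role of the user-controllable influence knob.
        x0_pred = (1 - scale) * x0_pred + scale * exemplar
        x = a_prev.sqrt() * x0_pred + (1 - a_prev).sqrt() * eps
    return x

if __name__ == "__main__":
    model = ToyDenoiser()
    exemplar = torch.randn(1, 32)              # retrieved gesture features (toy)
    latent = ddim_invert(model, exemplar)      # invert exemplar to a starting latent
    gesture = ddim_sample_with_retrieval_guidance(model, latent, exemplar, scale=0.3)
    print(gesture.shape)
```

Setting `scale` higher would let a retrieved exemplar dominate the output, while `scale=0` recovers unguided sampling; this mirrors, at a toy level, the per-insertion influence control described above.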
Comment: Preprint. Project page: https://vcai.mpi-inf.mpg.de/projects/RAG-Gesture/
Database: arXiv