Disentangled Acoustic Fields For Multimodal Physical Scene Understanding

Author: Yin, Jie; Luo, Andrew; Du, Yilun; Cherian, Anoop; Marks, Tim K.; Le Roux, Jonathan; Gan, Chuang
Publication year: 2024
Subject:
Document type: Working Paper
Description: We study the problem of multimodal physical scene understanding, where an embodied agent must find fallen objects by inferring the object properties, direction, and distance of an impact sound source. Previous works adopt feed-forward neural networks to directly regress these variables from sound, leading to poor generalization and domain adaptation issues. In this paper, we show that learning a disentangled model of acoustic formation, referred to as a disentangled acoustic field (DAF), to capture the sound generation and propagation process enables the embodied agent to construct a spatial uncertainty map over where the objects may have fallen. We demonstrate that our analysis-by-synthesis framework can jointly infer sound properties by explicitly decomposing and factorizing the latent space of the disentangled model. We further show that the spatial uncertainty map can significantly improve the success rate of fallen-object localization by proposing multiple plausible exploration locations.
Database: arXiv
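The analysis-by-synthesis idea described in the abstract can be sketched in a toy form: a hand-written generative model stands in for the learned DAF decoder, and candidate latents (impact distance and direction) are scored by how well the sound they would generate matches the observation; normalizing the scores yields a spatial uncertainty map. The function names and the two-feature sound model below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def render_sound(distance, angle):
    """Toy generative model standing in for a learned DAF decoder:
    loudness decays with distance, and a crude binaural cue encodes
    the direction of the impact."""
    loudness = 1.0 / (1.0 + distance)
    direction_cue = np.sin(angle)  # interaural-difference-like feature
    return np.array([loudness, direction_cue])

def uncertainty_map(observed, distances, angles, temperature=0.05):
    """Analysis-by-synthesis scoring: render the sound for every
    (distance, angle) candidate, measure the reconstruction error
    against the observation, and softmax the negated errors into a
    probability map over candidate fall locations."""
    errs = np.array([[np.sum((render_sound(d, a) - observed) ** 2)
                      for a in angles] for d in distances])
    logits = -errs / temperature
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

# Candidate grid of fall locations around the agent.
distances = np.linspace(0.5, 5.0, 20)
angles = np.linspace(-np.pi, np.pi, 36)

# Observation from a hidden fall site at distance 2.0, angle 0.5 rad.
observed = render_sound(2.0, 0.5)

pmap = uncertainty_map(observed, distances, angles)
di, ai = np.unravel_index(np.argmax(pmap), pmap.shape)
best_distance, best_angle = distances[di], angles[ai]
```

Note that with this toy direction cue the map has two high-probability regions (the cue cannot distinguish an angle from its mirror image), which illustrates why an uncertainty map proposing multiple plausible exploration locations is more useful to the agent than a single regressed point estimate.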