Permutation Invariant Recurrent Neural Networks for Sound Source Tracking Applications

Authors: Diaz-Guerra, David; Politis, Archontis; Miguel, Antonio; Beltran, Jose R.; Virtanen, Tuomas
Year: 2023
Document type: Working Paper
DOI: 10.61782/fa.2023.1132
Abstract: Many neural-network-based multi-source localization and tracking models use one or several recurrent layers at their final stages to track the movement of the sources. Conventional recurrent neural networks (RNNs), such as long short-term memory (LSTM) or gated recurrent unit (GRU) networks, take a vector as their input and use another vector to store their state. This approach packs the information from all the sources into a single ordered vector, which is suboptimal for permutation-invariant problems such as multi-source tracking. In this paper, we present a new recurrent architecture that represents both its input and its state as unordered sets, and that is invariant to permutations of the input set and equivariant to permutations of the state set. Hence, every sound source is represented by an individual embedding, and new estimates are assigned to the tracked trajectories regardless of their order.
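The two symmetry properties claimed in the abstract can be illustrated with a minimal set-state recurrent step. This is not the architecture from the paper (which includes an assignment mechanism between estimates and tracks); it is a hypothetical deep-sets-style sketch, with illustrative weight names, showing how pooling the input set and applying a shared per-slot update yield input-permutation invariance and state-permutation equivariance:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (illustrative choice)

# Shared weights applied identically to every state slot
# (names W_h, W_x are hypothetical, not from the paper).
W_h = rng.standard_normal((d, d)) * 0.1
W_x = rng.standard_normal((d, d)) * 0.1

def set_rnn_step(H, X):
    """One recurrent step on a state *set* H (n_tracks, d)
    given an input *set* X (n_estimates, d).

    - Input-permutation invariance: X enters only through an
      order-free mean pool, so shuffling X changes nothing.
    - State-permutation equivariance: every row of H is updated
      by the same shared function, so permuting the rows of H
      permutes the output rows in the same way.
    """
    ctx = X.mean(axis=0)                 # order-free summary of the input set
    return np.tanh(H @ W_h + ctx @ W_x)  # shared update, broadcast over slots

H = rng.standard_normal((3, d))  # three tracked sources
X = rng.standard_normal((2, d))  # two new source estimates

out = set_rnn_step(H, X)

# Invariance: reversing the input set leaves the output unchanged.
assert np.allclose(out, set_rnn_step(H, X[::-1]))

# Equivariance: permuting the state set permutes the output identically.
p = np.array([2, 0, 1])
assert np.allclose(out[p], set_rnn_step(H[p], X))
```

A conventional LSTM or GRU would instead concatenate all source information into one ordered vector, so the same checks would fail; keeping one embedding per source is what makes the order-free assignment possible.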
Comment: Accepted for publication at Forum Acusticum 2023
Database: arXiv