Pseudo-Mallows for Efficient Probabilistic Preference Learning

Author: Liu, Qinghua; Vitelli, Valeria; Mannino, Carlo; Frigessi, Arnoldo; Scheel, Ida
Publication year: 2022
Subject:
Document type: Working Paper
Description: We propose the Pseudo-Mallows distribution over the set of all permutations of $n$ items to approximate the posterior distribution arising from a Mallows likelihood. The Mallows model has proven useful in recommender systems, where it can learn personal preferences from the highly incomplete data provided by users. MCMC-based inference is, however, slow, preventing its use in real-time applications. The Pseudo-Mallows distribution is a product of univariate discrete Mallows-like distributions, constrained to remain in the space of permutations. The quality of the approximation depends on the order of the $n$ items used to determine the factorisation sequence. In a variational setting, we optimise the variational order parameter by minimising a marginalised KL divergence. We propose an approximate algorithm for this discrete optimisation and conjecture a certain form of the optimal variational order that depends on the data; empirical evidence and some theory support the conjecture. Sampling from the Pseudo-Mallows distribution allows fast preference learning, compared to alternative MCMC-based options, when the data come in the form of partial rankings of the items or of clicks on some items. Through simulations and a real-life case study, we demonstrate that the Pseudo-Mallows model learns personal preferences very well and makes recommendations much more efficiently than the exact Bayesian Mallows model, while maintaining similar accuracy.
Comment: 45 pages, 11 Figures
Database: arXiv
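
The description above characterises the Pseudo-Mallows distribution as a product of univariate discrete Mallows-like distributions, constrained to the permutation space and factorised in a chosen item order. A minimal Python sketch of such a sequential sampler is given below; the consensus ranking `rho0`, the precision `alpha`, and the footrule-style weights are assumptions made for illustration only, not the paper's exact definition.

```python
# Illustrative sketch only: sequentially sample ranks item by item, restricting
# each univariate Mallows-like distribution to the ranks still available so the
# result stays a valid permutation. The weight form and parameters below are
# assumptions for this example, not the authors' exact specification.
import numpy as np


def sample_pseudo_mallows(rho0, alpha, order, rng=None):
    """Draw one ranking of n items.

    rho0  : consensus ranking; rho0[i] is the rank of item i (1..n)  [assumed]
    alpha : precision parameter (larger => samples concentrate around rho0)
    order : sequence of item indices fixing the factorisation order
    """
    rng = rng or np.random.default_rng()
    n = len(rho0)
    available = list(range(1, n + 1))      # ranks not yet assigned
    ranking = np.empty(n, dtype=int)
    for item in order:
        ranks = np.array(available)
        # univariate Mallows-like weights, restricted to the free ranks
        w = np.exp(-(alpha / n) * np.abs(ranks - rho0[item]))
        w /= w.sum()
        r = rng.choice(ranks, p=w)
        ranking[item] = r
        available.remove(int(r))           # keep the result a permutation
    return ranking


# Example: 5 items, consensus ranking (1,...,5), items processed in consensus order.
rho0 = np.array([1, 2, 3, 4, 5])
print(sample_pseudo_mallows(rho0, alpha=3.0, order=np.argsort(rho0)))
```

Because each item's rank is drawn from a one-dimensional discrete distribution, sampling costs a single pass over the items in the chosen order, which is what makes this factorised approximation fast compared with MCMC over the full permutation space.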