Author:
Yıldız, İlkay; Dy, Jennifer; Erdoğmuş, Deniz; Ostmo, Susan; Campbell, J. Peter; Chiang, Michael F.; Ioannidis, Stratis
Subject:
Source:
ACM Transactions on Knowledge Discovery from Data; Dec 2022, Vol. 16, Issue 6, p1-38, 38p
Abstract:
We study the problem of ranking regression, in which a dataset of rankings is used to learn Plackett-Luce scores as functions of sample features. We propose a novel spectral algorithm to accelerate learning in ranking regression. Our main technical contribution is to show that the Plackett-Luce negative log-likelihood augmented with a proximal penalty has stationary points that satisfy the balance equations of a Markov chain. This allows us to tackle the ranking regression problem via an efficient spectral algorithm by using the Alternating Directions Method of Multipliers (ADMM). ADMM separates the learning of scores and model parameters and, in turn, enables us to devise fast spectral algorithms for ranking regression via both shallow and deep neural network (DNN) models. For shallow models, our algorithms are up to 579 times faster than Newton's method. For DNN models, we extend the standard ADMM via a Kullback-Leibler proximal penalty and show that this is still amenable to fast inference via a spectral approach. Compared to a state-of-the-art siamese network, our resulting algorithms are up to 175 times faster and attain better predictions, by up to 26% in Top-1 accuracy and 6% in Kendall-Tau correlation, over five real-life ranking datasets. [ABSTRACT FROM AUTHOR]
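For reference, the Plackett-Luce model named in the abstract assigns each sample a positive score and models a ranking as a sequence of choices, each proportional to the chosen sample's score among those not yet ranked. The sketch below (illustrative only, not the authors' spectral/ADMM implementation; function and variable names are ours) computes the corresponding negative log-likelihood for one observed ranking.

# Illustrative sketch: Plackett-Luce negative log-likelihood for one ranking.
import numpy as np

def plackett_luce_nll(scores, ranking):
    # scores: positive Plackett-Luce scores, one per sample
    # ranking: sample indices ordered from most to least preferred
    nll = 0.0
    for k in range(len(ranking) - 1):
        remaining = ranking[k:]          # samples still unranked at stage k
        top = ranking[k]                 # sample chosen at stage k
        # P(choose top among remaining) = scores[top] / sum(scores[remaining])
        nll -= np.log(scores[top]) - np.log(scores[remaining].sum())
    return nll

# Example: three samples with scores 3, 1, 2 and observed ranking 0 > 2 > 1
scores = np.array([3.0, 1.0, 2.0])
print(plackett_luce_nll(scores, np.array([0, 2, 1])))

In ranking regression as described in the abstract, such scores are not free parameters but outputs of a (shallow or DNN) model of the sample features, and ADMM alternates between updating the scores and the model parameters.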
Database:
Complementary Index |
External link: