Learning Discriminative Alpha-Beta-divergence for Positive Definite Matrices (Extended Version)
Author: Cherian, Anoop; Stanitsas, Panagiotis; Harandi, Mehrtash; Morellas, Vassilios; Papanikolopoulos, Nikolaos
Publication Year: 2017
Subject:
Document Type: Working Paper
Description: Symmetric positive definite (SPD) matrices are useful for capturing second-order statistics of visual data. To compare two SPD matrices, several measures are available, such as the affine-invariant Riemannian metric, Jeffreys divergence, and the Jensen-Bregman logdet divergence; however, their behavior may be application dependent, raising the need for manual selection to achieve the best possible performance. Further, owing to the computational cost of these measures in large-scale problems, computing pairwise similarities via clever embeddings of SPD matrices is often preferred to their direct use. In this paper, we propose a discriminative metric learning framework, Information Divergence and Dictionary Learning (IDDL), that not only learns application-specific measures on SPD matrices automatically, but also embeds them as vectors using a learned dictionary. To learn the similarity measures (which could potentially be distinct for every dictionary atom), we use the recently introduced alpha-beta-logdet divergence, which is known to unify the measures listed above (its general form is sketched after this record). We propose a novel IDDL objective that learns the parameters of the divergence and the dictionary atoms jointly in a discriminative setup and is solved efficiently using Riemannian optimization. We showcase extensive experiments on eight computer vision datasets, demonstrating state-of-the-art performance. Comment: Accepted at the International Conference on Computer Vision (ICCV)
Database: arXiv
External Link:
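
For reference, a minimal sketch of the alpha-beta log-det divergence mentioned in the abstract, written in the parameterization of Cichocki, Cruces, and Amari (2015); the exact form and scaling used in the paper may differ. For SPD matrices \(\mathbf{P}\) and \(\mathbf{Q}\),

\[
D^{(\alpha,\beta)}(\mathbf{P} \parallel \mathbf{Q}) \;=\; \frac{1}{\alpha\beta}\,\log\det\!\left(\frac{\alpha\,(\mathbf{P}\mathbf{Q}^{-1})^{\beta} \;+\; \beta\,(\mathbf{P}\mathbf{Q}^{-1})^{-\alpha}}{\alpha+\beta}\right),
\qquad \alpha \neq 0,\; \beta \neq 0,\; \alpha+\beta \neq 0 .
\]

Special and limiting choices of \((\alpha,\beta)\) recover, up to constant scaling, the measures named in the abstract: for example, \(\alpha=\beta=\tfrac{1}{2}\) gives the Jensen-Bregman logdet (Stein) divergence, the limit \(\alpha=\beta\to 0\) gives the squared affine-invariant Riemannian metric, and other settings recover Jeffreys and related divergences. This unification is what allows a single learned \((\alpha,\beta)\) pair per dictionary atom to interpolate among these measures.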