Inferring Feature Relevances From Metric Learning
Author: | Bassam Mokbel, Alexander Schulz, Barbara Hammer, Michael Biehl |
---|---|
Contributors: | Intelligent Systems |
Language: | English |
Year of publication: | 2015 |
Subject: |
Eigenvalues and eigenfunctions, Measurement, Computer science, Covariance matrices, Feature extraction, Pattern recognition, Electronic mail, k-nearest neighbors algorithm, Correlation, Feature (computer vision), Metric (mathematics), Prototypes, Relevance (information retrieval), Artificial intelligence, Data mining, Curse of dimensionality, Interpretability |
Source: | Computational Intelligence, 2015 IEEE Symposium Series on (SSCI), pp. 1599-1606 |
DOI: | 10.1109/SSCI.2015.225 |
Description: | Powerful metric learning algorithms proposed in recent years not only greatly enhance the accuracy of distance-based classifiers and nearest-neighbor database retrieval, but also make these operations interpretable by assigning explicit relevance weights to the individual data components. Starting with the work SSCI13Stretal, it has been noticed, however, that this procedure has very limited validity in the important case of high data dimensionality or strong feature correlations: the resulting relevance profiles are random to a large extent, leading to invalid interpretations and to fluctuating accuracy on novel data. While the work SSCI13Stretal proposes a first cure by means of L2-regularisation, it preserves only strongly relevant features, leaving weakly relevant and not necessarily unique features undetected. In this contribution, we enhance the technique by an efficient linear programming scheme which enables the unique identification of a relevance interval for every observed feature, this way identifying both strongly and weakly relevant features for a given metric. |
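The relevance-interval idea described above can be sketched as a pair of linear programs per feature: among all weight vectors that realize an equivalent metric, minimize and maximize the weight assigned to that feature. The following is a minimal illustration only, not the authors' implementation; the toy equality constraints, the normalization, and the use of `scipy.optimize.linprog` are assumptions made for the sketch. Two perfectly correlated (duplicated) features share a single constrained weight sum, so each individually gets a wide interval (weakly relevant), while an uncorrelated pinned feature gets a degenerate interval (strongly relevant):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical setup: 3 features, where features 0 and 1 are perfectly
# correlated duplicates, so any learned metric only pins down the SUM
# of their relevance weights; feature 2 is pinned exactly.
# Equality constraints A_eq @ w = b_eq describe the equivalence class
# of weight vectors yielding the same metric.
A_eq = np.array([[1.0, 1.0, 0.0],   # w0 + w1 = 0.6
                 [0.0, 0.0, 1.0]])  # w2      = 0.4
b_eq = np.array([0.6, 0.4])

def relevance_interval(j, n=3):
    """Min and max weight of feature j over all equivalent metrics (w >= 0)."""
    e = np.zeros(n)
    e[j] = 1.0
    # Minimize w_j for the lower bound; maximize (minimize -w_j) for the upper.
    lo = linprog(e, A_eq=A_eq, b_eq=b_eq, bounds=(0, None)).fun
    hi = -linprog(-e, A_eq=A_eq, b_eq=b_eq, bounds=(0, None)).fun
    return lo, hi

for j in range(3):
    print(j, relevance_interval(j))
```

Features 0 and 1 each receive the interval [0, 0.6] (either can carry the shared relevance), whereas feature 2 receives [0.4, 0.4]; a nonzero lower bound marks a strongly relevant feature, while a zero lower bound with a nonzero upper bound marks a weakly relevant one.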
Database: | OpenAIRE |
External link: |