Representation Learning with Weighted Inner Product for Universal Approximation of General Similarities
Author: Akifumi Okuno, Kazuki Fukui, Geewook Kim, Hidetoshi Shimodaira
Year: 2019
Subject: Computer Science - Machine Learning (cs.LG); Statistics - Machine Learning (stat.ML); artificial neural network; graph embedding; feature learning; cosine similarity; similarity (network science); positive-definite matrix; discrete mathematics
Source: IJCAI
DOI: 10.48550/arxiv.1902.10409
Description: We propose $\textit{weighted inner product similarity}$ (WIPS) for neural network-based graph embedding. In addition to the parameters of the neural networks, we optimize the weights of the inner product, allowing both positive and negative values. Despite its simplicity, WIPS can approximate arbitrary general similarities, including positive definite, conditionally positive definite, and indefinite kernels. WIPS is free from similarity model selection, since it can learn any similarity model, such as cosine similarity, negative Poincaré distance, and negative Wasserstein distance. Our experiments show that the proposed method learns high-quality distributed representations of nodes from real datasets, leading to an accurate approximation of similarities as well as high performance in inductive tasks. Comment: 8 pages, 2 figures, IJCAI 2019
Database: OpenAIRE
External link:
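The core idea in the abstract — an inner product whose per-dimension weights may be positive or negative, so the induced kernel can be indefinite rather than only positive definite — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name `wips` and the toy vectors are assumptions chosen for the example.

```python
import numpy as np

def wips(x, y, w):
    """Weighted inner product similarity: <x, y>_w = sum_i w_i * x_i * y_i.

    Unlike the plain inner product (all w_i = 1), the weights w may take
    negative values, so the resulting similarity is an indefinite kernel in
    general. In the paper's setting, w is learned jointly with the neural
    network producing the embeddings x and y.
    """
    return float(np.sum(w * x * y))

# Toy example with signed weights: WIPS splits into an inner product over
# the positive-weight coordinates minus one over the negative-weight
# coordinates (here the last coordinate contributes with a minus sign).
x = np.array([1.0, 2.0, 3.0])
y = np.array([0.5, -1.0, 2.0])
w = np.array([1.0, 1.0, -1.0])  # illustrative signed weights
print(wips(x, y, w))  # 0.5 - 2.0 - 6.0 = -7.5
```

With all weights fixed to +1 this reduces to the ordinary dot product; letting some weights go negative is what enables WIPS to mimic similarities such as negative squared distances, which are conditionally positive definite rather than positive definite.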