Random geometric graphs in high dimension
Authors: Vittorio Erba, Sebastiano Ariosto, Marco Gherardi, Pietro Rotondo
Year of publication: 2020
Subjects:
Discrete mathematics; Statistical Mechanics (cond-mat.stat-mech); Null model; Computation; Nonlinear dimensionality reduction; FOS: Physical sciences; Observable; Disordered Systems and Neural Networks (cond-mat.dis-nn); Condensed Matter - Disordered Systems and Neural Networks; 01 natural sciences; 010305 fluids & plasmas; Data set; Spatial network; Dimensional reduction; 0103 physical sciences; Leverage (statistics); 010306 general physics; Condensed Matter - Statistical Mechanics; Mathematics
Source: Physical Review E 102(1-1)
ISSN: 2470-0053
Description: Many machine learning algorithms used for dimensionality reduction and manifold learning leverage the computation of the nearest neighbours of each point in a dataset to perform their tasks. These proximity relations define a so-called geometric graph, in which two nodes are linked if they are sufficiently close to each other. Random geometric graphs, where the positions of the nodes are generated at random in a subset of $\mathbb{R}^{d}$, offer a null model for studying typical properties of datasets and of machine learning algorithms. Up to now, most of the literature has focused on the characterization of low-dimensional random geometric graphs, whereas typical datasets of interest in machine learning live in high-dimensional spaces ($d \gg 10^{2}$). In this work, we consider the infinite-dimensional limit of hard and soft random geometric graphs, and we show how to compute the average number of subgraphs of a given finite size $k$, e.g. the average number of $k$-cliques. This analysis highlights that local observables display different behaviors depending on the chosen ensemble: soft random geometric graphs with continuous activation functions converge to the naive infinite-dimensional limit provided by Erd\H{o}s-R\'enyi graphs, whereas hard random geometric graphs can show systematic deviations from it. We present numerical evidence that our analytical insights, exact in infinite dimensions, also provide a good approximation for dimensions $d \gtrsim 10$.
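The kind of comparison described above can be illustrated numerically. The sketch below is not code from the paper; it is a minimal, self-contained example that samples a hard random geometric graph (uniform points in $[0,1]^d$, hard distance threshold) and compares its triangle count with the Erdős-Rényi expectation at the same edge density. The parameters `n`, `d` and `p_target`, and the quantile-based choice of the connection radius, are illustrative assumptions, not the authors' construction.

```python
# Minimal sketch (assumptions labelled): sample a hard random geometric graph
# with n points in d dimensions and compare its triangle (3-clique) count with
# the Erdos-Renyi expectation at the same edge density.
import numpy as np

def hard_rgg_adjacency(n, d, p_target, rng):
    """Uniform points in [0, 1]^d; connect pairs whose Euclidean distance is
    below a radius chosen so that the edge density is roughly p_target."""
    x = rng.random((n, d))
    dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)
    r = np.quantile(dist[iu], p_target)  # connection radius from the distance quantile
    adj = dist < r
    np.fill_diagonal(adj, False)
    return adj

def count_triangles(adj):
    # trace(A^3) counts each triangle 6 times (3 starting vertices x 2 orientations)
    a = adj.astype(np.int64)
    return int(np.trace(a @ a @ a)) // 6

rng = np.random.default_rng(0)
n, d, p = 200, 100, 0.1          # illustrative parameters
adj = hard_rgg_adjacency(n, d, p, rng)
t_rgg = count_triangles(adj)
t_er = n * (n - 1) * (n - 2) / 6 * p**3   # E[#triangles] for G(n, p)
print(f"hard RGG triangles (d={d}): {t_rgg}   ER expectation: {t_er:.1f}")
```

Running the same sketch at small $d$ versus large $d$ gives a rough picture of how close the hard-RGG clique statistics come to the Erdős-Rényi baseline; fixing the edge density via a distance quantile is just one convenient way to put the two ensembles on equal footing.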
Database: OpenAIRE
External link: