The staircase property: How hierarchical structure can guide deep learning

Author: Abbe, Emmanuel; Boix-Adsera, Enric; Brennan, Matthew; Bresler, Guy; Nagaraj, Dheeraj
Year of publication: 2021
Subject:
Document type: Working Paper
Description: This paper identifies a structural property of data distributions that enables deep neural networks to learn hierarchically. We define the "staircase" property for functions over the Boolean hypercube, which posits that high-order Fourier coefficients are reachable from lower-order Fourier coefficients along increasing chains. We prove that functions satisfying this property can be learned in polynomial time using layerwise stochastic coordinate descent on regular neural networks -- a class of network architectures and initializations with homogeneity properties. Our analysis shows that for such staircase functions and neural networks, the gradient-based algorithm learns high-level features by greedily combining lower-level features along the depth of the network. We further back our theoretical results with experiments showing that staircase functions are also learnable by more standard ResNet architectures trained with stochastic gradient descent. Both the theoretical and experimental results support the claim that the staircase property helps explain the capabilities of gradient-based learning on regular networks, in contrast to general polynomial-size networks, which have recently been shown to be able to emulate any SQ or PAC algorithm.
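For context (an illustrative sketch added here, not part of the database record): a prototypical function with the staircase property is a sum of nested monomials, where each term extends the previous one by a single coordinate, so each higher-degree Fourier coefficient lies one step above a lower-degree coefficient already present in the function:

% Illustrative example (assumed notation): the degree-d staircase function
% on the Boolean hypercube \{-1,+1\}^n, whose monomials form an increasing chain.
\[
  S_d(x) \;=\; x_1 \;+\; x_1 x_2 \;+\; x_1 x_2 x_3 \;+\; \cdots \;+\; \prod_{i=1}^{d} x_i,
  \qquad x \in \{-1,+1\}^n .
\]

Here a layerwise, gradient-based learner can first pick up the linear term $x_1$, then use it to detect $x_1 x_2$, and so on up the chain, which is the hierarchical learning dynamic the abstract describes.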
Comment: 60 pages, accepted to NeurIPS '21
Database: arXiv