Finite Littlestone Dimension Implies Finite Information Complexity

Authors: Pradeep, Aditya; Nachum, Ido; Gastpar, Michael
Subject:
Description: We prove that every online learnable class of functions of Littlestone dimension $d$ admits a learning algorithm with finite information complexity. Towards this end, we use the notion of a globally stable algorithm. In general, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in $d$. We also show there is room for improvement: for a canonical online learnable class, indicator functions of affine subspaces of dimension $d$, the information complexity can be upper bounded logarithmically in $d$.
Database: OpenAIRE
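As a concrete illustration of the quantity $d$ in the abstract (not taken from the paper), the Littlestone dimension of a finite class can be computed by brute force from its recursive characterization: a class has dimension at least $d+1$ iff some point splits it into two subclasses, each of dimension at least $d$. The sketch below assumes hypotheses are given as tuples of $\{0,1\}$-labels over a finite domain; the function name `ldim` is hypothetical.

```python
def ldim(H, points):
    """Littlestone dimension of a finite class H, given as a collection of
    tuples of {0,1}-labels over the listed points (brute-force recursion)."""
    H = frozenset(H)
    if len(H) <= 1:
        # A single hypothesis admits no mistake tree; empty class: -1 by convention.
        return 0 if H else -1
    best = 0
    for i in points:
        # Split H by the label each hypothesis assigns to point i.
        H0 = frozenset(h for h in H if h[i] == 0)
        H1 = frozenset(h for h in H if h[i] == 1)
        if H0 and H1:
            # A depth-(d+1) mistake tree rooted at point i exists iff
            # both label-branches support depth-d mistake trees.
            best = max(best, 1 + min(ldim(H0, points), ldim(H1, points)))
    return best

# Thresholds h_t(x) = 1[x >= t] on three points: four hypotheses,
# Littlestone dimension 2.
thresholds = [(1, 1, 1), (0, 1, 1), (0, 0, 1), (0, 0, 0)]
print(ldim(thresholds, range(3)))  # -> 2
```

The exponential cost of this recursion mirrors why the dimension, not the class size, governs the bounds in the abstract: thresholds on $n$ points have $n+1$ hypotheses but Littlestone dimension only $\lfloor \log_2(n+1) \rfloor$.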