Bayesian optimistic Kullback–Leibler exploration
Author: | Daniel D. Lee, Pedro A. Ortega, Kee-Eung Kim, Kanghoon Lee, Geon-Hyeong Kim |
---|---|
Year: | 2018 |
Subject: | Mathematical optimization, Kullback–Leibler divergence, Relation (database), Computer science, Bayesian probability, Statistics::Computation, Set (abstract data type), Distribution (mathematics), Artificial Intelligence, Reinforcement learning, Divergence (statistics), Software, SIMPLE algorithm |
Source: | Machine Learning, 108:765–783 |
ISSN: | 1573-0565, 0885-6125 |
DOI: | 10.1007/s10994-018-5767-4 |
Abstract: | We consider a Bayesian approach to model-based reinforcement learning, in which the agent uses a distribution over environment models to find the action that optimally trades off exploration and exploitation. Unfortunately, computing the Bayes-optimal solution is intractable except in restricted cases. In this paper, we present BOKLE, a simple algorithm that uses the Kullback–Leibler divergence to constrain the set of plausible models and thereby guide exploration. We provide a formal analysis showing that the algorithm is near Bayes-optimal with high probability. We also show an asymptotic relation between the solution pursued by BOKLE and the well-known Bayesian exploration bonus algorithm. Finally, we present experimental results that clearly demonstrate the exploration efficiency of the algorithm. |
Database: | OpenAIRE |
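The abstract gives only a high-level description of the mechanism. The minimal Python sketch below illustrates what a KL-constrained optimistic planner of this flavour can look like for a tabular MDP with Dirichlet posteriors over transitions; it is not the authors' BOKLE implementation. The direction of the KL constraint, the `eps0 / n(s, a)` budget schedule, the one-parameter inner maximization, and all function names are assumptions made purely for illustration.

```python
import numpy as np


def kl(p, q, tiny=1e-12):
    """KL(p || q) between two discrete distributions given as 1-D arrays."""
    p = np.clip(p, tiny, None)
    q = np.clip(q, tiny, None)
    return float(np.sum(p * np.log(p / q)))


def optimistic_transition(p_hat, values, eps, iters=60):
    """Return a transition distribution near the posterior mean p_hat that is
    optimistic for `values`, subject to KL(p_hat || p) <= eps.

    The search is restricted to a one-parameter family that shifts mass m from
    the other next states (proportionally) onto the most valuable one; both the
    KL divergence and the expected value grow monotonically with m, so a simple
    bisection finds the largest feasible shift."""
    best = int(np.argmax(values))
    max_shift = 1.0 - p_hat[best]
    if max_shift <= 1e-9:
        return p_hat.copy()

    def shifted(m):
        p = p_hat * (1.0 - m / max_shift)  # scale mass away from other states
        p[best] = p_hat[best] + m          # move it onto the best next state
        return p

    lo, hi = 0.0, max_shift - 1e-9
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if kl(p_hat, shifted(mid)) <= eps:
            lo = mid
        else:
            hi = mid
    return shifted(lo)


def optimistic_value_iteration(alpha, rewards, gamma=0.95, eps0=1.0, sweeps=300):
    """Optimistic value iteration under Dirichlet posteriors over transitions.

    alpha   : Dirichlet counts over next states, shape (S, A, S)
    rewards : known mean rewards, shape (S, A)
    The per-pair KL budget eps0 / n(s, a) shrinks as evidence accumulates, so
    the optimism fades for well-visited pairs, loosely mirroring the vanishing
    exploration bonus of BEB mentioned in the abstract."""
    S, A, _ = alpha.shape
    p_hat = alpha / alpha.sum(axis=2, keepdims=True)   # posterior mean model
    Q = np.zeros((S, A))
    for _ in range(sweeps):
        V = Q.max(axis=1)
        for s in range(S):
            for a in range(A):
                eps_sa = eps0 / alpha[s, a].sum()
                p_opt = optimistic_transition(p_hat[s, a], V, eps_sa)
                Q[s, a] = rewards[s, a] + gamma * float(p_opt @ V)
    return Q


if __name__ == "__main__":
    # Toy 2-state, 2-action chain: action 1 in state 0 is barely explored, so
    # its optimistic Q-value should exceed its posterior-mean estimate.
    alpha = np.array([[[50.0, 1.0], [1.0, 1.0]],
                      [[1.0, 50.0], [25.0, 25.0]]])
    rewards = np.array([[0.0, 0.0], [1.0, 1.0]])
    print(np.round(optimistic_value_iteration(alpha, rewards), 3))
```

Running the toy example prints a higher optimistic Q-value for the under-explored action in state 0, which is the kind of directed optimism the abstract credits to constraining the set of plausible models with the KL divergence.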