Bandit Social Learning: Exploration under Myopic Behavior

Authors: Banihashem, Kiarash; Hajiaghayi, MohammadTaghi; Shin, Suho; Slivkins, Aleksandrs
Publication Year: 2023
Subject:
Document Type: Working Paper
Description: We study social learning dynamics motivated by reviews on online platforms. The agents collectively follow a simple multi-armed bandit protocol, but each agent acts myopically, without regard to exploration. We allow a wide range of myopic behaviors that are consistent with (parameterized) confidence intervals for the arms' expected rewards. We derive stark learning failures for any such behavior, and provide matching positive results. As a special case, we obtain the first general results on the failure of the greedy algorithm in bandits, thus providing a theoretical foundation for why bandit algorithms should explore.
Comment: Extended version of NeurIPS 2023 paper titled "Bandit Social Learning under Myopic Behavior"
Databáze: arXiv
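
As a rough illustration (not from the paper), the sketch below simulates the kind of exploration failure the description refers to: purely greedy agents on a two-armed Bernoulli bandit can lock onto the inferior arm with constant probability. The arm means (0.6 vs. 0.4), horizon, and threshold are hypothetical choices for the demo, not values from the paper.

    # Minimal sketch: greedy (myopic) arm selection on a two-armed
    # Bernoulli bandit, assuming hypothetical arm means 0.6 and 0.4.
    import random

    def greedy_run(mu=(0.6, 0.4), horizon=1000, seed=0):
        rng = random.Random(seed)
        pulls = [0, 0]        # times each arm was chosen
        rewards = [0.0, 0.0]  # cumulative reward per arm
        # Sample each arm once so empirical means are defined.
        for arm in (0, 1):
            pulls[arm] += 1
            rewards[arm] += rng.random() < mu[arm]
        for _ in range(horizon - 2):
            # Myopic choice: best empirical mean, no exploration bonus.
            means = [rewards[a] / pulls[a] for a in (0, 1)]
            arm = 0 if means[0] >= means[1] else 1
            pulls[arm] += 1
            rewards[arm] += rng.random() < mu[arm]
        return pulls

    # A constant fraction of runs ends with the worse arm (mean 0.4)
    # pulled almost exclusively -- a learning failure of greedy play.
    stuck = sum(greedy_run(seed=s)[1] > 900 for s in range(1000))
    print(f"runs locked onto the inferior arm: {stuck}/1000")

Intuitively, if the better arm's first sample happens to be 0 while the worse arm's is 1, greedy play never revisits the better arm, so its empirical mean never recovers; this is the failure mode that exploration bonuses are designed to rule out.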