Description: |
We analyze the dynamics of the Learning-Without-Recall model with Gaussian priors in a dynamic social network. Agents seeking to learn the state of the world, the "truth", exchange signals about their current beliefs over a changing network and update their beliefs accordingly. The agents are assumed memoryless and rational, meaning that they Bayes-update their beliefs using only their current beliefs and the signals they receive, with no other information from the past. We further assume that each agent hears a noisy signal of the truth at a frequency bounded away from zero. Under these conditions, we show that the system reaches truthful consensus almost surely, with a convergence rate that is polynomial in expectation. Somewhat paradoxically, high outdegree can slow down the learning process. The lower-bound assumption on the truth-hearing frequency is necessary: hearing the truth infinitely often, but at a frequency not bounded away from zero, offers no guarantee of truthful consensus in the limit.
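
As a rough illustration (not taken from the paper), with Gaussian priors and Gaussian noise a memoryless Bayesian update reduces to a precision-weighted average of an agent's current estimate, its in-neighbors' reported estimates, and any noisy truth signal heard that round. The Python sketch below simulates one plausible instantiation of such dynamics; the parameters (n, T, p_truth, tau_obs, tau_peer, the edge probability) and the freshly re-drawn random graph each round are assumptions made purely for illustration, not the paper's construction.

# Illustrative sketch only: memoryless Gaussian Bayes updates on a dynamic random network.
import numpy as np

rng = np.random.default_rng(0)

n, T = 20, 200        # agents, rounds (illustrative values)
theta = 1.0           # the unknown "truth"
p_truth = 0.3         # per-round truth-hearing probability, bounded away from zero
tau_obs = 4.0         # precision of the noisy truth signal
tau_peer = 1.0        # precision attributed to a neighbor's reported mean

mu = rng.normal(0.0, 1.0, n)   # Gaussian prior means
tau = np.ones(n)               # Gaussian prior precisions

for _ in range(T):
    A = rng.random((n, n)) < 0.2       # fresh random directed graph each round
    np.fill_diagonal(A, False)
    new_mu, new_tau = mu.copy(), tau.copy()
    for i in range(n):
        num, den = tau[i] * mu[i], tau[i]      # precision-weighted accumulation
        for j in np.flatnonzero(A[:, i]):      # agent i hears each in-neighbor's mean
            num += tau_peer * mu[j]
            den += tau_peer
        if rng.random() < p_truth:             # occasional noisy signal of the truth
            s = theta + rng.normal(0.0, tau_obs ** -0.5)
            num += tau_obs * s
            den += tau_obs
        new_mu[i], new_tau[i] = num / den, den  # conjugate Gaussian posterior
    mu, tau = new_mu, new_tau                   # memoryless: only the current state is kept

print("max |mu_i - theta| after", T, "rounds:", np.abs(mu - theta).max())

In this toy run the estimates contract toward theta because every agent keeps receiving truth signals with probability bounded away from zero, mirroring (informally) the role of the truth-hearing assumption in the result stated above.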