Description: |
As experiments advance to record from tens of thousands of neurons, statistical physics provides a framework for understanding how collective activity emerges from networks of fine-scale correlations. While modeling these populations is tractable in loop-free networks, neural circuitry inherently contains feedback loops of connectivity. Here, for a class of networks with loops, we present an exact solution to the maximum entropy problem that scales to very large systems. This solution provides direct access to information-theoretic measures, such as the entropy of the model and the information contained in correlations, which are usually inaccessible at large scales. In turn, this allows us to search for the optimal network of correlations that contains the maximum information about population activity. Applying these methods to 45 recordings of approximately 10,000 neurons in the mouse visual system, we demonstrate that our framework captures more information than existing loop-free methods, providing a better description of the population. For a given population, our models perform even better during visual stimulation than during spontaneous activity; however, the inferred interactions overlap significantly, suggesting an underlying neural circuitry that remains consistent across stimuli. Overall, we construct an optimized framework for studying the statistical physics of large neural populations, with future applications extending to other biological networks.
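For reference, the quantities named above (the maximum entropy model, its entropy, and the information contained in correlations) are commonly written in the standard pairwise form below; this is a generic illustration under that assumption, not the specific loop-structured model class solved in the paper.

% Standard pairwise maximum entropy model over binary activities \sigma_i (illustrative)
\[
P(\boldsymbol{\sigma}) = \frac{1}{Z}\exp\!\Big(\sum_i h_i \sigma_i + \sum_{i<j} J_{ij}\,\sigma_i \sigma_j\Big),
\qquad
S[P] = -\sum_{\boldsymbol{\sigma}} P(\boldsymbol{\sigma}) \log P(\boldsymbol{\sigma}),
\]
% Information carried by the correlations: entropy drop from the independent
% (factorized) model with matched firing rates to the correlated model.
\[
I_{\mathrm{corr}} = S_{\mathrm{indep}} - S[P].
\]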